BARS: Joint Search of Cell Topology and Layout for Accurate and
Efficient Binary ARchitectures
- URL: http://arxiv.org/abs/2011.10804v3
- Date: Sat, 27 Mar 2021 05:54:26 GMT
- Title: BARS: Joint Search of Cell Topology and Layout for Accurate and
Efficient Binary ARchitectures
- Authors: Tianchen Zhao, Xuefei Ning, Xiangsheng Shi, Songyi Yang, Shuang Liang,
Peng Lei, Jianfei Chen, Huazhong Yang, Yu Wang
- Abstract summary: Binary Neural Networks (BNNs) have received significant attention due to their promising efficiency.
Currently, most BNN studies directly adopt widely-used CNN architectures, which can be suboptimal for BNNs.
This paper proposes a novel Binary ARchitecture Search (BARS) flow to discover a superior binary architecture in a large design space.
- Score: 21.671696519808226
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Binary Neural Networks (BNNs) have received significant attention due to
their promising efficiency. Currently, most BNN studies directly adopt
widely-used CNN architectures, which can be suboptimal for BNNs. This paper
proposes a novel Binary ARchitecture Search (BARS) flow to discover a superior
binary architecture in a large design space. Specifically, we analyze the
information bottlenecks related to both the topology and layout architecture
design choices, and we propose to automatically search for the optimal
information flow. To achieve that, we design a two-level (Macro & Micro)
search space tailored for BNNs and apply a differentiable neural architecture
search (NAS) method to explore this search space efficiently. The macro-level
search space includes width and depth decisions, which are required to better
balance model performance and complexity. We also design the micro-level
search space to strengthen the information flow for BNNs. A notable challenge
of BNN architecture search is that binary operations exacerbate the "collapse"
problem of differentiable NAS, for which we incorporate various search and
derivation strategies to stabilize the search process. On CIFAR-10, BARS
achieves 1.5% higher accuracy with 2/3 the binary operations and 1/10 the
floating-point operations compared with existing BNN NAS studies. On ImageNet,
with similar resource consumption, the BARS-discovered architecture achieves a
6% accuracy gain over hand-crafted binary ResNet-18 architectures and
outperforms other binary architectures while fully binarizing the architecture
backbone.
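
A rough, hedged illustration of the two ingredients the abstract combines: sign-binarized operations trained with a straight-through estimator (STE), and a DARTS-style differentiable relaxation that softmax-weights candidate operations with learnable architecture parameters. The PyTorch sketch below is a minimal example under stated assumptions; the class names (SignSTE, BinConv, MixedOp) and the three-op candidate set are illustrative, not BARS's actual micro-level search space.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SignSTE(torch.autograd.Function):
    """Binarize to {-1, +1} in the forward pass; pass gradients straight
    through where |x| <= 1 (the standard straight-through estimator)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * (x.abs() <= 1).float()


class BinConv(nn.Module):
    """Convolution whose weights and activations are both binarized."""

    def __init__(self, channels, kernel_size):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(channels, channels, kernel_size, kernel_size) * 0.1
        )
        self.padding = kernel_size // 2

    def forward(self, x):
        return F.conv2d(SignSTE.apply(x), SignSTE.apply(self.weight),
                        padding=self.padding)


class MixedOp(nn.Module):
    """One cell edge: a softmax-weighted sum over binary candidate ops.
    The architecture parameters `alpha` are learned jointly with the
    weights; after search, the op with the largest alpha is kept."""

    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList(
            [BinConv(channels, 3), BinConv(channels, 1), nn.Identity()]
        )
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))


# Usage: differentiable w.r.t. both network weights and architecture params.
edge = MixedOp(channels=16)
out = edge(torch.randn(2, 16, 8, 8))
```

In this relaxation, the "collapse" the abstract mentions corresponds to the architecture parameters degenerating toward trivial ops such as the identity, which binarization reportedly exacerbates; BARS's search-and-derive stabilization strategies are not reproduced in this sketch.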
Related papers
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs over a wide range of operation counts (OPs), from 20M to 200M.
In addition, we validate the transferability of these searched BNNs on the object detection task, and our binary detectors with the searched BNNs achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS COCO dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- Flexible Channel Dimensions for Differentiable Architecture Search [50.33956216274694]
We propose a novel differentiable neural architecture search method with an efficient dynamic channel allocation algorithm.
We show that the proposed framework can find DNN architectures that match those found by previous methods in task accuracy and inference latency.
arXiv Detail & Related papers (2023-06-13T15:21:38Z)
- BNAS v2: Learning Architectures for Binary Networks with Empirical Improvements [11.978082858160576]
Backbone architectures of most binary networks are well-known floating point (FP) architectures such as the ResNet family.
We propose to search architectures for binary networks by defining a new search space for binary architectures and a novel search objective.
We show that our method searches architectures with stable training curves despite the quantization error inherent in binary networks.
arXiv Detail & Related papers (2021-10-16T12:38:26Z)
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search [100.28980854978768]
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
arXiv Detail & Related papers (2021-03-23T10:05:58Z)
- NASB: Neural Architecture Search for Binary Convolutional Neural Networks [2.3204178451683264]
We propose a strategy, named NASB, which adopts Neural Architecture Search (NAS) to find an optimal architecture for the binarization of CNNs.
Due to the flexibility of this automated strategy, the obtained architecture is not only suitable for binarization but also has low overhead.
NASB outperforms existing single and multiple binary CNNs by up to 4.0% and 1.0% Top-1 accuracy respectively.
arXiv Detail & Related papers (2020-08-08T13:06:11Z)
- Multi-Objective Neural Architecture Search Based on Diverse Structures and Adaptive Recommendation [4.595675084986132]
The search space of neural architecture search (NAS) for convolutional neural networks (CNNs) is huge.
We propose the MoARR algorithm, which utilizes existing research results and historical information to quickly find architectures that are both lightweight and accurate.
Experimental results show that our MoARR can achieve a powerful and lightweight model (with 1.9% error rate and 2.3M parameters) on CIFAR-10 in 6 GPU hours.
arXiv Detail & Related papers (2020-07-06T13:42:33Z)
- Binarizing MobileNet via Evolution-based Searching [66.94247681870125]
We propose the use of evolutionary search to facilitate the construction and training scheme when binarizing MobileNet.
Inspired by one-shot architecture search frameworks, we manipulate the idea of group convolution to design efficient 1-Bit Convolutional Neural Networks (CNNs).
Our objective is to arrive at a tiny yet efficient binary neural architecture by exploring the best group-convolution candidates; a minimal sketch of this building block appears after this list.
arXiv Detail & Related papers (2020-05-13T13:25:51Z)
- BATS: Binary ArchitecTure Search [56.87581500474093]
We show that directly applying Neural Architecture Search to the binary domain provides very poor results.
Specifically, we introduce and design a novel binary-oriented search space.
We also set a new state-of-the-art for binary neural networks on CIFAR10, CIFAR100 and ImageNet datasets.
arXiv Detail & Related papers (2020-03-03T18:57:02Z)
- Learning Architectures for Binary Networks [10.944100369283483]
Backbone architectures of most binary networks are well-known floating point architectures such as the ResNet family.
We propose to search architectures for binary networks by defining a new search space for binary architectures and a novel search objective.
We show that our proposed method searches architectures with stable training curves despite the quantization error inherent in binary networks.
arXiv Detail & Related papers (2020-02-17T14:06:45Z)
- BNAS: An Efficient Neural Architecture Search Approach Using Broad Scalable Architecture [62.587982139871976]
We propose Broad Neural Architecture Search (BNAS), where we elaborately design a broad scalable architecture dubbed Broad Convolutional Neural Network (BCNN).
BNAS delivers a search cost of 0.19 days, which is 2.37x less expensive than ENAS, the best-ranked among reinforcement learning-based NAS approaches.
arXiv Detail & Related papers (2020-01-18T15:07:55Z)
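
As promised in the "Binarizing MobileNet via Evolution-based Searching" entry above, here is a hedged sketch of the group-convolution building block it searches over: the number of groups in a 1-bit convolution is the searchable knob, trading accuracy against binary-operation count. The names (sign_ste, GroupedBinConv) and the MAC-count formula below are assumptions for illustration, not the paper's actual block.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def sign_ste(x):
    """Binarize to {-1, +1}; forward uses sign(), backward flows through
    the clipped identity (straight-through estimator)."""
    binary = torch.sign(x)
    clipped = x.clamp(-1, 1)
    return binary.detach() + clipped - clipped.detach()


class GroupedBinConv(nn.Module):
    """1-bit 3x3 convolution whose `groups` count is a searchable choice."""

    def __init__(self, channels, groups):
        super().__init__()
        assert channels % groups == 0
        self.groups = groups
        self.weight = nn.Parameter(
            torch.randn(channels, channels // groups, 3, 3) * 0.1
        )

    def forward(self, x):
        return F.conv2d(sign_ste(x), sign_ste(self.weight),
                        padding=1, groups=self.groups)


# Doubling `groups` roughly halves the binary multiply-accumulates:
x = torch.randn(1, 32, 8, 8)
for g in (1, 2, 4):
    macs = 32 * (32 // g) * 3 * 3 * 8 * 8  # out_ch * in_ch_per_group * k*k * H*W
    print(g, GroupedBinConv(32, g)(x).shape, f"{macs:,} binary MACs")
```

An evolutionary search, as in that paper, would then select among such group counts (and other block choices) to balance accuracy against this operation budget.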
This list is automatically generated from the titles and abstracts of the papers on this site.