BNAS v2: Learning Architectures for Binary Networks with Empirical
Improvements
- URL: http://arxiv.org/abs/2110.08562v1
- Date: Sat, 16 Oct 2021 12:38:26 GMT
- Authors: Dahyun Kim, Kunal Pratap Singh, Jonghyun Choi
- Abstract summary: Backbone architectures of most binary networks are well-known floating point (FP) architectures such as the ResNet family.
We propose to search architectures for binary networks by defining a new search space for binary architectures and a novel search objective.
We show that our method searches architectures with stable training curves despite the quantization error inherent in binary networks.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Backbone architectures of most binary networks are well-known floating point
(FP) architectures such as the ResNet family. Questioning that the
architectures designed for FP networks might not be the best for binary
networks, we propose to search architectures for binary networks (BNAS) by
defining a new search space for binary architectures and a novel search
objective. Specifically, based on the cell based search method, we define the
new search space of binary layer types, design a new cell template, and
rediscover the utility of and propose to use the Zeroise layer instead of using
it as a placeholder. The novel search objective diversifies early search to
learn better performing binary architectures. We show that our method searches
architectures with stable training curves despite the quantization error
inherent in binary networks. Quantitative analyses demonstrate that our
searched architectures outperform the architectures used in state-of-the-art
binary networks and outperform or perform on par with state-of-the-art binary
networks that employ various techniques other than architectural changes. In
addition, we propose improvements to the training scheme of our searched
architectures. With the new training scheme, we achieve state-of-the-art
performance among binary networks, outperforming all previous methods by
non-trivial margins.
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention owing to their highly secure identification.
Deep learning (DL) models have recently been applied successfully to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z) - Network Graph Based Neural Architecture Search [57.78724765340237]
We search for neural networks by rewiring the corresponding graphs and predict architecture performance from graph properties.
Because we do not perform machine learning over the entire graph space, the search process is remarkably efficient.
arXiv Detail & Related papers (2021-12-15T00:12:03Z) - Enhanced Gradient for Differentiable Architecture Search [17.431144144044968]
We propose a neural network architecture search algorithm aiming to simultaneously improve network performance and reduce network complexity.
The proposed framework automatically builds the network architecture at two stages: block-level search and network-level search.
Experiment results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification.
arXiv Detail & Related papers (2021-03-23T13:27:24Z) - BARS: Joint Search of Cell Topology and Layout for Accurate and Efficient Binary ARchitectures [21.671696519808226]
Binary Neural Networks (BNNs) have received significant attention due to their promising efficiency.
Currently, most BNN studies directly adopt widely-used CNN architectures, which can be suboptimal for BNNs.
This paper proposes a novel Binary ARchitecture Search (BARS) flow to discover superior binary architecture in a large design space.
arXiv Detail & Related papers (2020-11-21T14:38:44Z) - NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z) - Multi-Objective Neural Architecture Search Based on Diverse Structures and Adaptive Recommendation [4.595675084986132]
The search space of neural architecture search (NAS) for convolutional neural networks (CNNs) is huge.
We propose MoARR algorithm, which utilizes the existing research results and historical information to quickly find architectures that are both lightweight and accurate.
Experimental results show that our MoARR can achieve a powerful and lightweight model (with 1.9% error rate and 2.3M parameters) on CIFAR-10 in 6 GPU hours.
arXiv Detail & Related papers (2020-07-06T13:42:33Z) - A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z) - Binarizing MobileNet via Evolution-based Searching [66.94247681870125]
We propose using evolutionary search to facilitate the construction and training scheme when binarizing MobileNet.
Inspired by one-shot architecture search frameworks, we manipulate the idea of group convolution to design efficient 1-Bit Convolutional Neural Networks (CNNs).
Our objective is to come up with a tiny yet efficient binary neural architecture by exploring the best candidates of the group convolution.
arXiv Detail & Related papers (2020-05-13T13:25:51Z) - BATS: Binary ArchitecTure Search [56.87581500474093]
We show that directly applying Neural Architecture Search to the binary domain provides very poor results.
Specifically, we introduce and design a novel binary-oriented search space.
We also set a new state-of-the-art for binary neural networks on CIFAR10, CIFAR100 and ImageNet datasets.
arXiv Detail & Related papers (2020-03-03T18:57:02Z) - Learning Architectures for Binary Networks [10.944100369283483]
Backbone architectures of most binary networks are well-known floating point architectures such as the ResNet family.
We propose to search architectures for binary networks by defining a new search space for binary architectures and a novel search objective.
We show that our proposed method searches architectures with stable training curves despite the quantization error inherent in binary networks.
arXiv Detail & Related papers (2020-02-17T14:06:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.