Learning Architectures for Binary Networks
- URL: http://arxiv.org/abs/2002.06963v2
- Date: Fri, 10 Apr 2020 09:08:41 GMT
- Title: Learning Architectures for Binary Networks
- Authors: Dahyun Kim, Kunal Pratap Singh, Jonghyun Choi
- Abstract summary: Backbone architectures of most binary networks are well-known floating point architectures such as the ResNet family.
We propose to search architectures for binary networks by defining a new search space for binary architectures and a novel search objective.
We show that our proposed method searches architectures with stable training curves despite the quantization error inherent in binary networks.
- Score: 10.944100369283483
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Backbone architectures of most binary networks are well-known floating point
architectures such as the ResNet family. Questioning whether architectures
designed for floating point networks are also the best for binary networks,
we propose to search architectures for binary networks (BNAS) by defining a new
search space for binary architectures and a novel search objective.
Specifically, building on the cell-based search method, we define a new search
space of binary layer types, design a new cell template, and rediscover the
utility of the Zeroise layer, proposing to use it as an actual layer type
rather than a mere placeholder. The novel search objective diversifies early search to learn
better performing binary architectures. We show that our proposed method
searches architectures with stable training curves despite the quantization
error inherent in binary networks. Quantitative analyses demonstrate that our
searched architectures outperform the architectures used in state-of-the-art
binary networks and outperform or perform on par with state-of-the-art binary
networks that employ various techniques other than architectural changes.
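The quantization error the abstract refers to comes from replacing floating point weights with binary ones. A minimal sketch, assuming NumPy and XNOR-Net-style scaled-sign binarization (an illustrative choice, not necessarily the paper's exact scheme), of how large that error is for Gaussian weights:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=(3, 3, 16, 16))  # a hypothetical FP conv kernel

# Scaled-sign binarization: sign(w) scaled by the mean absolute value,
# the least-squares-optimal binary approximation of w.
alpha = np.abs(w).mean()
w_bin = alpha * np.sign(w)

# Relative quantization error -- the gap that destabilizes training in
# binary networks and that the searched architectures must tolerate.
err = np.linalg.norm(w - w_bin) / np.linalg.norm(w)
print(f"relative quantization error: {err:.3f}")
```

For Gaussian weights this relative error is around 0.6, illustrating why architectures tuned for floating point weights need not transfer well to the binary setting.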
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention due to their highly secure identification.
Deep learning (DL) models have recently been successfully applied to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- Network Graph Based Neural Architecture Search [57.78724765340237]
We search neural networks by rewiring the corresponding graph and predict the architecture's performance from graph properties.
Because we do not perform machine learning over the entire graph space, the searching process is remarkably efficient.
arXiv Detail & Related papers (2021-12-15T00:12:03Z)
- BNAS v2: Learning Architectures for Binary Networks with Empirical Improvements [11.978082858160576]
Backbone architectures of most binary networks are well-known floating point (FP) architectures such as the ResNet family.
We propose to search architectures for binary networks by defining a new search space for binary architectures and a novel search objective.
We show that our method searches architectures with stable training curves despite the quantization error inherent in binary networks.
arXiv Detail & Related papers (2021-10-16T12:38:26Z)
- Enhanced Gradient for Differentiable Architecture Search [17.431144144044968]
We propose a neural network architecture search algorithm aiming to simultaneously improve network performance and reduce network complexity.
The proposed framework automatically builds the network architecture at two stages: block-level search and network-level search.
Experiment results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification.
arXiv Detail & Related papers (2021-03-23T13:27:24Z)
- Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks [50.684661759340145]
Firefly neural architecture descent is a general framework for progressively and dynamically growing neural networks.
We show that firefly descent can flexibly grow networks both wider and deeper, and can be applied to learn accurate but resource-efficient neural architectures.
In particular, it learns networks that are smaller in size but have higher average accuracy than those learned by the state-of-the-art methods.
arXiv Detail & Related papers (2021-02-17T04:47:18Z)
- BARS: Joint Search of Cell Topology and Layout for Accurate and Efficient Binary ARchitectures [21.671696519808226]
Binary Neural Networks (BNNs) have received significant attention due to their promising efficiency.
Currently, most BNN studies directly adopt widely-used CNN architectures, which can be suboptimal for BNNs.
This paper proposes a novel Binary ARchitecture Search (BARS) flow to discover superior binary architecture in a large design space.
arXiv Detail & Related papers (2020-11-21T14:38:44Z)
- NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z)
- Binarizing MobileNet via Evolution-based Searching [66.94247681870125]
We propose the use of evolutionary search to facilitate the construction and training scheme when binarizing MobileNet.
Inspired by one-shot architecture search frameworks, we adapt the idea of group convolution to design efficient 1-Bit Convolutional Neural Networks (CNNs).
Our objective is to come up with a tiny yet efficient binary neural architecture by exploring the best candidates of the group convolution.
arXiv Detail & Related papers (2020-05-13T13:25:51Z)
- BATS: Binary ArchitecTure Search [56.87581500474093]
We show that directly applying Neural Architecture Search to the binary domain provides very poor results.
Specifically, we introduce and design a novel binary-oriented search space.
We also set a new state-of-the-art for binary neural networks on CIFAR10, CIFAR100 and ImageNet datasets.
arXiv Detail & Related papers (2020-03-03T18:57:02Z)
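Several of the entries above (BNAS, BARS, BATS) perform cell-based, differentiable search over binary operations. A minimal NumPy sketch of one DARTS-style mixed edge whose candidate set includes a Zeroise op; the op set and architecture parameters here are illustrative assumptions, not any paper's exact search space:

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 8))  # hypothetical FP weights for one candidate op

# Candidate ops on one edge of the cell: identity (skip), a binarized
# linear op (scaled-sign weights), and Zeroise, which outputs all zeros.
# In BNAS, Zeroise is kept as a usable layer type in the final
# architecture rather than treated as a "no connection" placeholder.
ops = [
    lambda x: x,                                     # identity / skip
    lambda x: x @ (np.abs(w).mean() * np.sign(w)),   # binarized linear op
    lambda x: np.zeros_like(x),                      # Zeroise
]

alpha = np.array([0.2, 1.5, -0.3])  # learnable architecture parameters
x = rng.normal(size=(4, 8))

# DARTS-style continuous relaxation: softmax-weighted sum of candidates.
mixed = sum(p * op(x) for p, op in zip(softmax(alpha), ops))
print(mixed.shape)
```

After search, each edge is discretized by keeping the candidate with the largest architecture weight; the binary-specific design choices lie in which ops populate the candidate set.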
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.