BATS: Binary ArchitecTure Search
- URL: http://arxiv.org/abs/2003.01711v2
- Date: Thu, 23 Jul 2020 21:57:18 GMT
- Title: BATS: Binary ArchitecTure Search
- Authors: Adrian Bulat and Brais Martinez and Georgios Tzimiropoulos
- Abstract summary: We show that directly applying Neural Architecture Search (NAS) to the binary domain yields very poor results.
To address this, we introduce and design a novel binary-oriented search space.
We also set a new state of the art for binary neural networks on the CIFAR10, CIFAR100 and ImageNet datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper proposes Binary ArchitecTure Search (BATS), a framework that
drastically reduces the accuracy gap between binary neural networks and their
real-valued counterparts by means of Neural Architecture Search (NAS). We show
that directly applying NAS to the binary domain provides very poor results. To
alleviate this, we describe, for the first time to our knowledge, the three key
ingredients for successfully applying NAS to the binary domain. Specifically,
we (1) introduce and design a novel binary-oriented search space, (2) propose a
new mechanism for controlling and stabilising the resulting searched
topologies, and (3) propose and validate a series of new search strategies for
binary networks that lead to faster convergence and lower search times.
Experimental results demonstrate the effectiveness of the proposed approach and
the necessity of searching in the binary space directly. Moreover, (4) we set a
new state of the art for binary neural networks on the CIFAR10, CIFAR100 and
ImageNet datasets. Code will be made available at
https://github.com/1adrianb/binary-nas
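The abstract's ingredients (1) and (2) can be made concrete with a small sketch. Below is a minimal, illustrative take, assuming a DARTS-style differentiable search: a cell edge mixes binary-friendly candidate ops, and a temperature-scaled softmax over the architecture parameters is annealed to sharpen the op choice. The op set, the straight-through estimator, and the temperature value are assumptions for illustration, not the paper's released code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinaryActivation(torch.autograd.Function):
    """Sign activation with a clipped straight-through estimator."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * (x.abs() <= 1).float()  # pass gradients only where |x| <= 1

class BinaryConv(nn.Module):
    """3x3 convolution on binarized activations (weights kept real to keep the sketch short)."""
    def __init__(self, c):
        super().__init__()
        self.conv = nn.Conv2d(c, c, 3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(c)

    def forward(self, x):
        return self.bn(self.conv(BinaryActivation.apply(x)))

class MixedOp(nn.Module):
    """One searchable edge: a temperature-scaled softmax mixes candidate ops."""
    def __init__(self, c):
        super().__init__()
        self.ops = nn.ModuleList([
            BinaryConv(c),          # binary-friendly convolution
            nn.Identity(),          # skip connection
            nn.AvgPool2d(3, 1, 1),  # pooling
        ])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # architecture params

    def forward(self, x, temperature):
        # Lower temperature -> sharper, more stable op selection.
        w = F.softmax(self.alpha / temperature, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

x = torch.randn(2, 16, 8, 8)
out = MixedOp(16)(x, temperature=0.2)  # anneal the temperature during search
```

After search, the op with the largest architecture weight on each edge would be kept and the network retrained from scratch, as is standard in differentiable NAS.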
Related papers
- DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit CNNs
1-bit convolutional neural networks (CNNs) with binary weights and activations show great potential for resource-limited embedded devices.
One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS.
We introduce Discrepant Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit CNNs.
arXiv Detail & Related papers (2023-06-27T11:28:29Z)
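A hedged sketch of the child-parent idea as summarised above: a real-valued parent network guides a 1-bit child through an extra discrepancy term on the logits. The tiny networks, the XNOR-Net-style per-channel weight scaling, and the 0.1 loss weight are illustrative assumptions, not DCP-NAS's actual formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def binarize(w):
    # 1-bit weights with XNOR-Net-style per-channel scaling; the
    # (sign(w) - w).detach() + w trick keeps gradients flowing to w.
    scale = w.abs().mean(dim=(1, 2, 3), keepdim=True).detach()
    return ((torch.sign(w) - w).detach() + w) * scale

class TinyNet(nn.Module):
    def __init__(self, binary):
        super().__init__()
        self.binary = binary
        self.conv = nn.Conv2d(3, 8, 3, padding=1)
        self.head = nn.Linear(8, 10)

    def forward(self, x):
        w = binarize(self.conv.weight) if self.binary else self.conv.weight
        x = F.relu(F.conv2d(x, w, self.conv.bias, padding=1))
        return self.head(x.mean(dim=(2, 3)))  # global average pool + classifier

parent, child = TinyNet(binary=False), TinyNet(binary=True)
x, y = torch.randn(4, 3, 8, 8), torch.randint(0, 10, (4,))
# Task loss on the 1-bit child plus a discrepancy term toward the parent.
c_logits = child(x)
loss = F.cross_entropy(c_logits, y) + 0.1 * F.mse_loss(c_logits, parent(x).detach())
loss.backward()
```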
- When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search
A key challenge in neural architecture search (NAS) is how to explore the huge search space wisely.
We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures.
TNAS finds the globally optimal architecture for CIFAR-10 in NAS-Bench-201, reaching a test accuracy of 94.37% in four GPU hours.
arXiv Detail & Related papers (2022-04-11T07:34:21Z)
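One simple way to read "exploring with a tree": descend a binary tree over each decision's candidate set, halving it at every step, so a decision over |ops| choices costs O(log |ops|) evaluations instead of |ops|. The toy sketch below is an assumption-laden illustration (the evaluate() stub stands in for proxy accuracy), not TNAS's actual bi-level tree.

```python
import random

OPS = ["conv3x3", "conv5x5", "sep3x3", "sep5x5", "maxpool", "avgpool", "skip", "zero"]

def evaluate(arch):
    # Deterministic stand-in for (proxy) accuracy of a trained architecture.
    random.seed(hash(tuple(arch)) % (2 ** 32))
    return random.random()

def pick_op(arch, pos, candidates):
    """Binary-tree descent: halve the candidate set until one op remains."""
    while len(candidates) > 1:
        mid = len(candidates) // 2
        left, right = candidates[:mid], candidates[mid:]
        s_left = evaluate(arch[:pos] + [left[0]] + arch[pos + 1:])
        s_right = evaluate(arch[:pos] + [right[0]] + arch[pos + 1:])
        candidates = left if s_left >= s_right else right
    return candidates[0]

arch = ["skip"] * 4  # four searchable positions in a toy cell
for i in range(len(arch)):
    arch[i] = pick_op(arch, i, OPS[:])
print(arch, evaluate(arch))
```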
- BNAS v2: Learning Architectures for Binary Networks with Empirical Improvements
Backbone architectures of most binary networks are well-known floating point (FP) architectures such as the ResNet family.
We propose to search architectures for binary networks by defining a new search space for binary architectures and a novel search objective.
We show that our method searches architectures with stable training curves despite the quantization error inherent in binary networks.
arXiv Detail & Related papers (2021-10-16T12:38:26Z)
- Search to aggregate neighborhood for graph neural network
We propose a framework, Search to Aggregate NEighborhood (SANE), to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we develop a differentiable search algorithm that is more efficient than previous reinforcement-learning-based methods.
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
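In the spirit of SANE's differentiable search, a hedged sketch of a softmax-relaxed choice over neighborhood aggregators: three aggregators are mixed by learnable architecture weights, and the argmax aggregator would be kept after search. The dense adjacency matrix and this particular aggregator set are simplifying assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SearchableAggregate(nn.Module):
    """Mixes mean/sum/max neighborhood aggregators with learnable weights."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.alpha = nn.Parameter(torch.zeros(3))  # one weight per aggregator

    def forward(self, x, adj):
        # x: [N, dim] node features; adj: [N, N] 0/1 adjacency matrix.
        deg = adj.sum(1, keepdim=True).clamp(min=1)
        sum_agg = adj @ x
        mean_agg = sum_agg / deg
        # Max over neighbors: mask out non-neighbors with -inf, then reduce.
        neigh = x.unsqueeze(0).expand(adj.size(0), -1, -1)  # [N, N, dim]
        neigh = neigh.masked_fill(~adj.unsqueeze(-1).bool(), float("-inf"))
        max_agg = neigh.max(dim=1).values
        max_agg = torch.where(torch.isinf(max_agg), torch.zeros_like(max_agg), max_agg)
        w = F.softmax(self.alpha, dim=0)
        return F.relu(self.lin(w[0] * mean_agg + w[1] * sum_agg + w[2] * max_agg))

x, adj = torch.randn(5, 16), (torch.rand(5, 5) > 0.5).float()
out = SearchableAggregate(16)(x, adj)  # after search, keep the argmax aggregator
```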
- OPANAS: One-Shot Path Aggregation Network Architecture Search for Object Detection
Recently, neural architecture search (NAS) has been exploited to design feature pyramid networks (FPNs).
We propose a novel One-Shot Path Aggregation Network Architecture Search (OPANAS) algorithm, which significantly improves both searching efficiency and detection accuracy.
arXiv Detail & Related papers (2021-03-08T01:48:53Z)
- Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective
We propose a novel framework called training-free neural architecture search (TE-NAS).
TE-NAS ranks architectures by analyzing the spectrum of the neural tangent kernel (NTK) and the number of linear regions in the input space.
We show that: (1) these two measurements imply the trainability and expressivity of a neural network; (2) they strongly correlate with the network's test accuracy.
arXiv Detail & Related papers (2021-02-23T07:50:44Z)
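Of TE-NAS's two training-free signals, the expressivity side can be approximated cheaply: count how many distinct ReLU activation patterns a batch of random inputs induces, a common proxy for the number of linear regions. A minimal sketch, with the network and sample sizes as illustrative assumptions (the NTK-spectrum half is omitted):

```python
import torch
import torch.nn as nn

def count_activation_patterns(net, n_samples=256, in_dim=16):
    """Counts distinct ReLU on/off patterns over random inputs."""
    x = torch.randn(n_samples, in_dim)
    signs, h = [], x
    for layer in net:
        h = layer(h)
        if isinstance(layer, nn.ReLU):
            signs.append(h > 0)            # binary activation pattern per layer
    codes = torch.cat(signs, dim=1)        # one concatenated code per input
    return len({tuple(row.tolist()) for row in codes})

net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 32), nn.ReLU())
with torch.no_grad():
    print(count_activation_patterns(net))  # higher ~ more expressive candidate
```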
- BARS: Joint Search of Cell Topology and Layout for Accurate and Efficient Binary ARchitectures
Binary Neural Networks (BNNs) have received significant attention due to their promising efficiency.
Currently, most BNN studies directly adopt widely-used CNN architectures, which can be suboptimal for BNNs.
This paper proposes a novel Binary ARchitecture Search (BARS) flow to discover superior binary architectures in a large design space.
arXiv Detail & Related papers (2020-11-21T14:38:44Z)
- Binarized Neural Architecture Search for Efficient Object Recognition
Binarized neural architecture search (BNAS) produces extremely compressed models to reduce huge computational cost on embedded devices for edge computing.
An accuracy of 96.53% vs. 97.22% is achieved on the CIFAR-10 dataset, but with a significantly compressed model and a 40% faster search than the state-of-the-art PC-DARTS.
arXiv Detail & Related papers (2020-09-08T15:51:23Z)
- Local Search is a Remarkably Strong Baseline for Neural Architecture Search
We consider, for the first time, a simple Local Search (LS) algorithm for Neural Architecture Search (NAS).
We release two benchmark datasets, named MacroNAS-C10 and MacroNAS-C100, containing 200K saved network evaluations for two established image classification tasks.
arXiv Detail & Related papers (2020-04-20T00:08:34Z)
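The LS baseline is simple enough to state end to end: start from a random architecture, move to an improving one-op-change neighbour, and stop at a local optimum. A minimal sketch under a stub evaluator; the macro encoding and evaluate() are assumptions, not the paper's benchmark setup.

```python
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]
LENGTH = 6  # number of searchable positions in a toy macro encoding

def evaluate(arch):
    # Deterministic stand-in for the accuracy of a trained architecture.
    random.seed(hash(tuple(arch)) % (2 ** 32))
    return random.random()

def neighbors(arch):
    """All architectures differing from `arch` in exactly one position."""
    for i in range(len(arch)):
        for op in OPS:
            if op != arch[i]:
                yield arch[:i] + [op] + arch[i + 1:]

def local_search():
    arch = [random.choice(OPS) for _ in range(LENGTH)]
    score = evaluate(arch)
    improved = True
    while improved:
        improved = False
        for cand in neighbors(arch):
            s = evaluate(cand)
            if s > score:
                arch, score, improved = cand, s, True
                break  # first-improvement move; restart neighborhood scan
    return arch, score

print(local_search())
```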
- Learning Architectures for Binary Networks
Backbone architectures of most binary networks are well-known floating point architectures such as the ResNet family.
We propose to search architectures for binary networks by defining a new search space for binary architectures and a novel search objective.
We show that our proposed method searches architectures with stable training curves despite the quantization error inherent in binary networks.
arXiv Detail & Related papers (2020-02-17T14:06:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated content (including all information) and is not responsible for any consequences.