Stacked BNAS: Rethinking Broad Convolutional Neural Network for Neural
Architecture Search
- URL: http://arxiv.org/abs/2111.07722v1
- Date: Mon, 15 Nov 2021 12:49:27 GMT
- Title: Stacked BNAS: Rethinking Broad Convolutional Neural Network for Neural
Architecture Search
- Authors: Zixiang Ding, Yaran Chen, Nannan Li, Dongbin Zhao
- Abstract summary: We propose Stacked BNAS, whose search space is an improved broad scalable architecture named Stacked BCNN that performs better than BNAS.
On the one hand, Stacked BCNN treats a mini-BCNN as its basic block to preserve comprehensive representations and deliver powerful feature-extraction ability.
On the other hand, we propose Knowledge Embedding Search (KES) to learn appropriate knowledge embeddings.
- Score: 16.6035648938434
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unlike NAS approaches built on deep scalable architectures, Broad
Neural Architecture Search (BNAS) proposes a broad search space, consisting of
convolution and enhancement blocks and dubbed Broad Convolutional Neural Network
(BCNN), which yields a substantial efficiency improvement. BCNN reuses the
topologies of cells in the convolution block, so that BNAS can employ few cells for
an efficient search. Moreover, multi-scale feature fusion and knowledge embedding
are proposed to improve the performance of the shallow-topology BCNN. However,
BNAS suffers from two drawbacks: 1) insufficient representation diversity for
feature fusion and enhancement, and 2) the time-consuming design of knowledge
embeddings by human experts.
In this paper, we propose Stacked BNAS, whose search space is an improved
broad scalable architecture named Stacked BCNN that performs better than
BNAS. On the one hand, Stacked BCNN treats a mini-BCNN as its basic block to
preserve comprehensive representations and deliver powerful feature-extraction
ability. On the other hand, we propose Knowledge Embedding Search (KES) to
learn appropriate knowledge embeddings. Experimental results show that 1)
Stacked BNAS obtains better performance than BNAS, 2) KES reduces the
parameters of the learned architecture while maintaining satisfactory performance,
and 3) Stacked BNAS delivers state-of-the-art efficiency of 0.02 GPU days.
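The broad architecture described above can be illustrated with a minimal, framework-free sketch. All names here (`make_cell`, `mini_bcnn`, `stacked_bcnn`) and the toy transforms are hypothetical stand-ins, assumed for illustration only; the real Stacked BCNN operates on image tensors with searched convolutional cell topologies.

```python
def make_cell(weight):
    """Stand-in for a searched cell: one fixed transform whose topology
    (here just `weight`) is reused throughout the convolution block."""
    return lambda x: [weight * v + 1.0 for v in x]

def mini_bcnn(x, cell, depth=3):
    """Basic block of Stacked BCNN: apply the reused cell `depth` times,
    fuse the multi-scale intermediate outputs, then 'enhance' the result."""
    scales = []
    for _ in range(depth):
        x = cell(x)           # convolution block reuses one cell topology
        scales.append(x)      # keep every scale for feature fusion
    # multi-scale feature fusion: element-wise average of all scales
    fused = [sum(vs) / len(vs) for vs in zip(*scales)]
    # enhancement block: a simple nonlinearity stands in here
    return [max(0.0, v) for v in fused]

def stacked_bcnn(x, num_blocks=2):
    """Stacked BCNN: mini-BCNNs chained as basic blocks."""
    cell = make_cell(0.5)     # one searched topology, reused everywhere
    for _ in range(num_blocks):
        x = mini_bcnn(x, cell)
    return x

print(stacked_bcnn([1.0, -2.0, 3.0]))
```

The point of the sketch is structural: each mini-BCNN keeps all of its intermediate (multi-scale) outputs for fusion, which is how Stacked BCNN preserves a comprehensive representation while staying shallow.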
Related papers
- NAS-BNN: Neural Architecture Search for Binary Neural Networks [55.058512316210056]
We propose a novel neural architecture search scheme for binary neural networks, named NAS-BNN.
Our discovered binary model family outperforms previous BNNs for a wide range of operations (OPs) from 20M to 200M.
In addition, we validate the transferability of these searched BNNs on the object detection task; our binary detectors with the searched BNNs achieve a new state-of-the-art result, e.g., 31.6% mAP with 370M OPs, on the MS dataset.
arXiv Detail & Related papers (2024-08-28T02:17:58Z)
- When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search [117.89827740405694]
A key challenge in neural architecture search (NAS) is how to explore the huge search space wisely.
We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures.
TNAS finds the globally optimal architecture on CIFAR-10 in NAS-Bench-201, with a test accuracy of 94.37%, in four GPU hours.
arXiv Detail & Related papers (2022-04-11T07:34:21Z)
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search [100.28980854978768]
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
arXiv Detail & Related papers (2021-03-23T10:05:58Z)
- BARS: Joint Search of Cell Topology and Layout for Accurate and Efficient Binary ARchitectures [21.671696519808226]
Binary Neural Networks (BNNs) have received significant attention due to their promising efficiency.
Currently, most BNN studies directly adopt widely-used CNN architectures, which can be suboptimal for BNNs.
This paper proposes a novel Binary ARchitecture Search (BARS) flow to discover superior binary architectures in a large design space.
arXiv Detail & Related papers (2020-11-21T14:38:44Z)
- BNAS-v2: Memory-efficient and Performance-collapse-prevented Broad Neural Architecture Search [15.287692867984228]
BNAS-v2 embodies both superiorities of BCNN simultaneously.
It uses a continuous relaxation strategy to make each edge of a cell relevant to all candidate operations.
Combining partial channel connections with edge normalization further improves memory efficiency.
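The continuous relaxation and partial channel connection mentioned in the BNAS-v2 summary can be sketched as follows. This is a toy, framework-free illustration: the operation set, the channel split, and all function names are assumptions for exposition; real DARTS-style methods apply these ideas to convolutional feature maps.

```python
import math

# Toy candidate operations; in practice these are conv/pool/skip ops.
OPS = {
    "identity": lambda x: x,
    "double":   lambda x: [2.0 * v for v in x],
    "negate":   lambda x: [-v for v in x],
}

def softmax(alphas):
    exps = [math.exp(a) for a in alphas]
    total = sum(exps)
    return [e / total for e in exps]

def mixed_op(x, alphas, k=2):
    """Continuous relaxation: each edge computes a softmax-weighted sum of
    all candidate ops. Partial channel connection: only the first 1/k of
    the 'channels' go through the mixed op; the rest bypass it unchanged,
    which is what saves memory in practice."""
    split = len(x) // k
    active, bypass = x[:split], x[split:]
    weights = softmax(alphas)
    mixed = [0.0] * len(active)
    for w, op in zip(weights, OPS.values()):
        out = op(active)
        mixed = [m + w * v for m, v in zip(mixed, out)]
    return mixed + bypass

alphas = [0.1, 1.5, -0.3]   # learnable architecture parameters of one edge
print(mixed_op([1.0, 2.0, 3.0, 4.0], alphas))
```

After search, the relaxation is discretized by keeping, on each edge, the operation with the largest architecture weight.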
arXiv Detail & Related papers (2020-09-18T15:25:08Z)
- Binarized Neural Architecture Search for Efficient Object Recognition [120.23378346337311]
Binarized neural architecture search (BNAS) produces extremely compressed models to reduce huge computational cost on embedded devices for edge computing.
An accuracy of 96.53% vs. 97.22% is achieved on the CIFAR-10 dataset, but with a significantly compressed model and a 40% faster search than the state-of-the-art PC-DARTS.
arXiv Detail & Related papers (2020-09-08T15:51:23Z)
- BATS: Binary ArchitecTure Search [56.87581500474093]
We show that directly applying Neural Architecture Search to the binary domain provides very poor results.
Specifically, we introduce and design a novel binary-oriented search space.
We also set a new state-of-the-art for binary neural networks on CIFAR10, CIFAR100 and ImageNet datasets.
arXiv Detail & Related papers (2020-03-03T18:57:02Z)
- BNAS: An Efficient Neural Architecture Search Approach Using Broad Scalable Architecture [62.587982139871976]
We propose Broad Neural Architecture Search (BNAS), in which we elaborately design a broad scalable architecture dubbed Broad Convolutional Neural Network (BCNN).
BNAS delivers a search cost of 0.19 days, which is 2.37x less expensive than ENAS, the best-ranked reinforcement learning-based NAS approach.
arXiv Detail & Related papers (2020-01-18T15:07:55Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.