TS-ENAS: Two-Stage Evolution for Cell-based Network Architecture Search
- URL: http://arxiv.org/abs/2310.09525v1
- Date: Sat, 14 Oct 2023 08:02:01 GMT
- Title: TS-ENAS: Two-Stage Evolution for Cell-based Network Architecture Search
- Authors: Juan Zou, Shenghong Wu, Yizhang Xia, Weiwei Jiang, Zeping Wu, Jinhua
Zheng
- Abstract summary: We propose a Two-Stage Evolution for cell-based Network Architecture Search (TS-ENAS)
In our algorithm, a new cell-based search space and an effective two-stage encoding method are designed to represent cells and neural network structures.
The experimental results show that TS-ENAS can more effectively find neural network architectures with comparable performance.
- Score: 3.267963071384687
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural network architecture search provides a solution to the automatic
design of network structures. However, it is difficult to search the whole
network architecture directly. Although using stacked cells to search neural
network architectures is an effective way to reduce the complexity of
searching, these methods are not able to find the globally optimal neural network
structure, since the number of layers, cells, and connection methods is fixed. In
this paper, we propose a Two-Stage Evolution for cell-based Network
Architecture Search (TS-ENAS), in which the first stage searches over stacked
cells and the second stage adjusts these cells. In our algorithm, a new
cell-based search space and an effective two-stage encoding method are designed
to represent cells and neural network structures. In addition, a cell-based
weight inheritance strategy is designed to initialize the weight of the
network, which significantly reduces the running time of the algorithm. The
proposed method is extensively tested on four image classification datasets
(Fashion-MNIST, CIFAR10, CIFAR100, and ImageNet) and compared with 22
state-of-the-art algorithms, including hand-designed networks and NAS networks.
The experimental results show that TS-ENAS can more effectively find neural
network architectures with comparable performance.
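The abstract gives no implementation details, so the following is only a minimal sketch of what a two-stage genome (a macro stage that decides how cells are stacked, a micro stage that tunes operations inside each cell) with cell-based weight inheritance could look like; every class, field, and operation name here is a hypothetical illustration, not the authors' code.

```python
# Hypothetical two-stage encoding; all names are illustrative assumptions.
import copy
import random

OPS = ["conv3x3", "conv5x5", "sep_conv3x3", "max_pool", "identity"]

class CellGene:
    """One cell: a list of (operation, input_index) node genes."""
    def __init__(self, n_nodes=4):
        self.nodes = [(random.choice(OPS), random.randrange(i + 1))
                      for i in range(n_nodes)]

class Genome:
    """Stage 1 decides how many cells are stacked; stage 2 tunes each cell."""
    def __init__(self, n_cells=6):
        self.cells = [CellGene() for _ in range(n_cells)]
        self.weights = {}  # trained weights keyed by (cell, node, op)

    def mutate_stage1(self):
        # Macro mutation: add or remove a stacked cell.
        if len(self.cells) > 2 and random.random() < 0.5:
            self.cells.pop(random.randrange(len(self.cells)))
        else:
            self.cells.append(CellGene())

    def mutate_stage2(self):
        # Micro mutation: swap one operation inside one cell.
        cell = random.choice(self.cells)
        i = random.randrange(len(cell.nodes))
        _, inp = cell.nodes[i]
        cell.nodes[i] = (random.choice(OPS), inp)

def inherit_weights(child, parent):
    """Copy trained weights for every (cell, node, op) the child shares
    with its parent, so only the mutated parts train from scratch."""
    for c, cell in enumerate(child.cells):
        if c >= len(parent.cells):
            continue  # newly added cell: nothing to inherit
        for n, (op, _) in enumerate(cell.nodes):
            key = (c, n, op)
            if parent.cells[c].nodes[n][0] == op and key in parent.weights:
                child.weights[key] = copy.deepcopy(parent.weights[key])
```

The point of `inherit_weights` is the claimed speedup: offspring reuse trained weights for every cell position they share with a parent, so only the mutated parts need training from scratch.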
Related papers
- G-EvoNAS: Evolutionary Neural Architecture Search Based on Network
Growth [6.712149832731174]
This paper proposes a computationally efficient neural architecture evolutionary search framework based on network growth (G-EvoNAS).
The G-EvoNAS is tested on three commonly used image classification datasets, CIFAR10, CIFAR100, and ImageNet.
Experimental results demonstrate that G-EvoNAS can find a neural network architecture comparable to state-of-the-art designs in 0.2 GPU days.
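As a hedged illustration of growth-based search, the toy loop below extends a block list greedily and stops once a proxy fitness stops improving; `estimate_fitness` and the block vocabulary are placeholders rather than G-EvoNAS components.

```python
# Toy growth loop; estimate_fitness is a placeholder for a cheap proxy
# evaluation (e.g. a few epochs of training), not G-EvoNAS's procedure.
import random

BLOCKS = ["resnet_block", "dw_sep_block", "bottleneck_block"]

def estimate_fitness(arch):
    return random.random()  # stand-in for validation accuracy

def grow_network(max_depth=12):
    arch, best = [], 0.0
    while len(arch) < max_depth:
        # Propose one deeper candidate per block type, keep the best.
        trials = [(estimate_fitness(arch + [b]), arch + [b]) for b in BLOCKS]
        score, cand = max(trials, key=lambda t: t[0])
        if score <= best:  # growing no longer helps: stop early
            break
        arch, best = cand, score
    return arch

print(grow_network())
```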
arXiv Detail & Related papers (2024-03-05T05:44:38Z) - NAS-ASDet: An Adaptive Design Method for Surface Defect Detection
Network using Neural Architecture Search [5.640706784987607]
We propose a new method called NAS-ASDet to adaptively design networks for surface defect detection.
First, a refined and industry-appropriate search space that can adaptively adjust the feature distribution is designed.
Then, a progressive search strategy with a deep supervision mechanism is used to explore the search space faster and better.
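Deep supervision itself is a standard mechanism: auxiliary heads on intermediate features feed extra loss terms so gradients reach early layers directly. A generic PyTorch sketch follows (the module layout is assumed, not NAS-ASDet's actual network).

```python
# Generic deep supervision in PyTorch (assumed layout, not NAS-ASDet's).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeeplySupervisedNet(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.stage1 = nn.Sequential(nn.Conv2d(3, 16, 3, 2, 1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(16, 32, 3, 2, 1), nn.ReLU())
        # Auxiliary head supervises the intermediate features directly.
        self.aux_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(16, num_classes))
        self.main_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                       nn.Linear(32, num_classes))

    def forward(self, x):
        f1 = self.stage1(x)
        f2 = self.stage2(f1)
        return self.main_head(f2), self.aux_head(f1)

net = DeeplySupervisedNet()
x, y = torch.randn(4, 3, 64, 64), torch.randint(0, 2, (4,))
main_out, aux_out = net(x)
# The auxiliary loss is down-weighted and simply added to the main loss.
loss = F.cross_entropy(main_out, y) + 0.3 * F.cross_entropy(aux_out, y)
loss.backward()
```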
arXiv Detail & Related papers (2023-11-18T03:15:45Z) - HASA: Hybrid Architecture Search with Aggregation Strategy for
Echinococcosis Classification and Ovary Segmentation in Ultrasound Images [0.0]
We propose a hybrid NAS framework for ultrasound (US) image classification and segmentation.
Our method can generate more powerful and lightweight models for the above US image classification and segmentation tasks.
arXiv Detail & Related papers (2022-04-14T01:43:00Z) - Enhanced Gradient for Differentiable Architecture Search [17.431144144044968]
We propose a neural network architecture search algorithm aiming to simultaneously improve network performance and reduce network complexity.
The proposed framework automatically builds the network architecture at two stages: block-level search and network-level search.
Experiment results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification.
arXiv Detail & Related papers (2021-03-23T13:27:24Z) - Firefly Neural Architecture Descent: a General Approach for Growing
Neural Networks [50.684661759340145]
Firefly neural architecture descent is a general framework for progressively and dynamically growing neural networks.
We show that firefly descent can flexibly grow networks both wider and deeper, and can be applied to learn accurate but resource-efficient neural architectures.
In particular, it learns networks that are smaller in size but have higher average accuracy than those learned by the state-of-the-art methods.
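A standard function-preserving way to grow a network wider is neuron splitting, the kind of growth move such methods score and apply; the NumPy sketch below demonstrates the idea on a tiny two-layer net and is a generic illustration, not the paper's code.

```python
# Function-preserving "grow wider" move (neuron splitting) on a 3->4->2 net.
import numpy as np

def widen(W1, W2, neuron):
    """Duplicate hidden unit `neuron`: copy its incoming weights, then halve
    both copies' outgoing weights so the network's function is unchanged."""
    W1_new = np.vstack([W1, W1[neuron:neuron + 1]])        # new hidden row
    W2_new = np.hstack([W2, W2[:, neuron:neuron + 1]])     # new output col
    W2_new[:, neuron] *= 0.5
    W2_new[:, -1] *= 0.5
    return W1_new, W2_new

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
x = rng.normal(size=3)
W1w, W2w = widen(W1, W2, neuron=1)
assert np.allclose(W2 @ np.tanh(W1 @ x), W2w @ np.tanh(W1w @ x))
```

The assertion holds because the duplicated unit's two outgoing copies are each halved, so their contributions sum back to the original output.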
arXiv Detail & Related papers (2021-02-17T04:47:18Z) - Trilevel Neural Architecture Search for Efficient Single Image
Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
To model the discrete search space, we apply a new continuous relaxation that builds a hierarchical mixture over network paths, cell operations, and kernel widths.
An efficient search algorithm is proposed to perform optimization in a hierarchical supernet manner.
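The relaxation is in the DARTS family: a discrete choice among candidate operations becomes a softmax-weighted mixture that is differentiable in the architecture parameters. A minimal PyTorch sketch of one such mixed operation (the operation set and shapes are assumptions):

```python
# DARTS-style continuous relaxation of one discrete operation choice.
import torch
import torch.nn as nn

class MixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.Identity(),
        ])
        # Architecture parameters, learned jointly with the weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        # Softmax turns the discrete choice into a differentiable mixture.
        w = torch.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

op = MixedOp(8)
y = op(torch.randn(1, 8, 16, 16))
# After search, the op with the largest alpha is kept (discretization).
print(torch.softmax(op.alpha, 0))
```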
arXiv Detail & Related papers (2021-01-17T12:19:49Z) - Continuous Ant-Based Neural Topology Search [62.200941836913586]
This work introduces a novel, nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization.
The Continuous Ant-based Neural Topology Search (CANTS) is strongly inspired by how ants move in the real world.
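As a rough illustration only, the toy loop below moves agents through a continuous 2D space, biases each step toward existing pheromone deposits, and evaporates pheromone every round; it simplifies the abstract heavily and is not CANTS itself.

```python
# Toy continuous pheromone model; a heavy simplification, not CANTS itself.
import random

pheromone = []  # (x, y, strength) deposits left by earlier ants

def step(x, y):
    if pheromone and random.random() < 0.7:
        # Move halfway toward a deposit sampled by strength (exploitation).
        total = sum(s for _, _, s in pheromone)
        r, acc = random.uniform(0, total), 0.0
        for px, py, s in pheromone:
            acc += s
            if acc >= r:
                return x + 0.5 * (px - x), y + 0.5 * (py - y)
    return x + random.gauss(0, 1), y + random.gauss(0, 1)  # exploration

for _ in range(20):                       # 20 ants
    x = y = 0.0
    path = []
    for _ in range(5):                    # each walks a 5-step path
        x, y = step(x, y)
        path.append((x, y))
    pheromone.extend((px, py, 1.0) for px, py in path)
    pheromone = [(px, py, 0.9 * s) for px, py, s in pheromone]  # evaporate
print(len(pheromone), "deposits")
```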
arXiv Detail & Related papers (2020-11-21T17:49:44Z) - SAR-NAS: Skeleton-based Action Recognition via Neural Architecture
Searching [18.860051578038608]
We encode a skeleton-based action instance into a tensor and define a set of operations to build two types of network cells: normal cells and reduction cells.
Experiments on the challenging NTU RGB+D and Kinetics datasets have verified that most of the networks developed to date for skeleton-based action recognition are likely not compact and efficient.
The proposed method provides an approach to search for a compact network that achieves comparable or even better performance than the state-of-the-art methods.
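The tensor encoding is concrete enough to sketch: a skeleton instance (frames x joints x 3D coordinates; NTU RGB+D uses 25 joints) can be laid out channels-first so that 2D cells convolve over time and joints the way image cells convolve over height and width. The exact layout below is an assumption.

```python
# Skeleton instance -> tensor (layout assumed; NTU RGB+D has 25 joints).
import numpy as np

frames, joints, coords = 64, 25, 3
skeleton = np.random.rand(frames, joints, coords).astype(np.float32)
# Channels-first (coords, frames, joints): 2D cells then convolve over the
# time and joint axes the way image cells convolve over height and width.
tensor = np.transpose(skeleton, (2, 0, 1))
print(tensor.shape)  # (3, 64, 25)
```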
arXiv Detail & Related papers (2020-10-29T03:24:15Z) - Neural Architecture Search of SPD Manifold Networks [79.45110063435617]
We propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
arXiv Detail & Related papers (2020-10-27T18:08:57Z) - DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
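Reading divide-and-conquer as cluster-then-search, a minimal sketch might cluster encoded architectures, score one representative per cluster, and then search only the winning cluster; the encoding and evaluator below are placeholders.

```python
# Cluster-then-search sketch; the encoding and evaluator are placeholders.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
archs = rng.integers(0, 5, size=(200, 10))   # 200 encoded architectures

def evaluate(arch):
    # Stand-in for expensive training; here: prefer genes close to 2.
    return float(-np.abs(arch - 2).sum())

km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(archs)
clusters = [archs[km.labels_ == c] for c in range(8)]
clusters = [c for c in clusters if len(c)]   # guard against empty clusters

# Divide: score one representative per cluster.
best_cluster = max(clusters, key=lambda members: evaluate(members[0]))
# Conquer: evaluate only the winning cluster exhaustively.
best = max(best_cluster, key=evaluate)
print(best, evaluate(best))
```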
arXiv Detail & Related papers (2020-05-29T09:02:16Z) - Binarizing MobileNet via Evolution-based Searching [66.94247681870125]
We propose using evolutionary search to facilitate the construction and training scheme when binarizing MobileNet.
Inspired by one-shot architecture search frameworks, we manipulate the idea of group convolution to design efficient 1-Bit Convolutional Neural Networks (CNNs).
Our objective is to come up with a tiny yet efficient binary neural architecture by exploring the best candidates of the group convolution.
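A generic evolutionary loop over per-layer group counts illustrates the kind of exploration described; the fitness function below is a stand-in for training and evaluating the binarized network, and all constants are assumptions.

```python
# Toy evolutionary search over per-layer group counts; fitness is a
# stand-in for accuracy of the trained binary network.
import random

LAYERS, GROUP_CHOICES = 8, [1, 2, 4, 8]

def fitness(cfg):
    return -sum(abs(g - 4) for g in cfg) + random.random()

pop = [[random.choice(GROUP_CHOICES) for _ in range(LAYERS)]
       for _ in range(16)]
for _ in range(10):                         # 10 generations
    pop.sort(key=fitness, reverse=True)
    parents = pop[:8]                       # keep the fittest half
    children = []
    for p in parents:
        child = p[:]
        child[random.randrange(LAYERS)] = random.choice(GROUP_CHOICES)
        children.append(child)              # one-point mutation
    pop = parents + children
print(max(pop, key=fitness))
```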
arXiv Detail & Related papers (2020-05-13T13:25:51Z)