G-EvoNAS: Evolutionary Neural Architecture Search Based on Network
Growth
- URL: http://arxiv.org/abs/2403.02667v1
- Date: Tue, 5 Mar 2024 05:44:38 GMT
- Title: G-EvoNAS: Evolutionary Neural Architecture Search Based on Network
Growth
- Authors: Juan Zou, Weiwei Jiang, Yizhang Xia, Yuan Liu, Zhanglu Hou
- Abstract summary: This paper proposes a computationally efficient neural architecture evolutionary search framework based on network growth (G-EvoNAS).
The G-EvoNAS is tested on three commonly used image classification datasets, CIFAR10, CIFAR100, and ImageNet.
Experimental results demonstrate that G-EvoNAS can find a neural network architecture comparable to state-of-the-art designs in 0.2 GPU days.
- Score: 6.712149832731174
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The evolutionary paradigm has been successfully applied to neural
architecture search (NAS) in recent years. Due to the vast complexity of the
global search space, current research mainly either stacks repeated partial
architectures to build the entire model or searches for the entire model from
manually designed benchmark modules. Both approaches attempt to reduce the
search difficulty by narrowing the search space. To efficiently search network
architecture in the global space, this paper proposes another solution, namely
a computationally efficient neural architecture evolutionary search framework
based on network growth (G-EvoNAS). The complete network is obtained by
gradually deepening different Blocks. The process begins from a shallow
network, grows and evolves, and gradually deepens into a complete network,
reducing the search complexity in the global space. Then, to improve the
ranking accuracy of the network, we reduce the weight coupling of each network
in the SuperNet by pruning the SuperNet according to elite groups at different
growth stages. The G-EvoNAS is tested on three commonly used image
classification datasets, CIFAR10, CIFAR100, and ImageNet, and compared with
various state-of-the-art algorithms, including hand-designed networks and NAS
networks. Experimental results demonstrate that G-EvoNAS can find a neural
network architecture comparable to state-of-the-art designs in 0.2 GPU days.
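
As a rough illustration of the grow-and-evolve loop the abstract describes, the following Python sketch evolves a population of shallow networks and deepens the elites one block per stage. The operation set, population sizes, mutation operator, and the random fitness proxy are all hypothetical stand-ins, not the paper's implementation; in particular, G-EvoNAS estimates fitness via a SuperNet that is pruned according to the elite group at each growth stage, which is only noted in a comment here.

```python
import random

# Toy operation vocabulary; a "block" is a single operation in this encoding.
OPS = ["conv3x3", "conv5x5", "sep_conv3x3", "max_pool", "skip"]

def random_block():
    return random.choice(OPS)

def mutate(arch):
    """Replace one block's operation at random."""
    child = list(arch)
    child[random.randrange(len(child))] = random_block()
    return child

def fitness(arch):
    # Stand-in for SuperNet-based accuracy estimation (hypothetical).
    return sum(op != "skip" for op in arch) + random.random()

def g_evonas_sketch(max_depth=5, pop_size=8, elite_k=2, generations=3):
    # Begin from shallow one-block networks, then grow stage by stage.
    population = [[random_block()] for _ in range(pop_size)]
    for depth in range(1, max_depth):
        for _ in range(generations):  # evolve at the current depth
            offspring = [mutate(a) for a in population]
            population = sorted(population + offspring,
                                key=fitness, reverse=True)[:pop_size]
        elites = population[:elite_k]
        # In the paper, the SuperNet would be pruned here according to
        # the elite group to reduce weight coupling between candidates.
        population = [e + [random_block()]  # deepen each elite by one block
                      for e in elites
                      for _ in range(pop_size // elite_k)]
    return max(population, key=fitness)

best = g_evonas_sketch()
print(len(best), best)
```

Each stage keeps only the elite subtrees and extends them, so the search never evaluates deep networks until their shallow prefixes have proven themselves, which is the source of the reduced search cost.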
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have been recently successfully applied for eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z) - Designing deep neural networks for driver intention recognition [40.87622566719826]
This paper applies neural architecture search to investigate the effects of the deep neural network architecture on a real-world safety critical application.
Eight search strategies are evaluated on two driver intention recognition datasets.
arXiv Detail & Related papers (2024-02-07T12:54:15Z) - TS-ENAS:Two-Stage Evolution for Cell-based Network Architecture Search [3.267963071384687]
We propose a Two-Stage Evolution algorithm for cell-based Network Architecture Search (TS-ENAS).
In our algorithm, a new cell-based search space and an effective two-stage encoding method are designed to represent cells and neural network structures.
The experimental results show that TS-ENAS can more effectively find neural network architectures with comparable performance.
arXiv Detail & Related papers (2023-10-14T08:02:01Z) - GeNAS: Neural Architecture Search with Better Generalization [14.92869716323226]
Recent neural architecture search (NAS) approaches rely on validation loss or accuracy to find the superior network for the target data.
In this paper, we investigate a new neural architecture search measure for discovering architectures with better generalization.
arXiv Detail & Related papers (2023-05-15T12:44:54Z) - Evolutionary Neural Cascade Search across Supernetworks [68.8204255655161]
We introduce ENCAS - Evolutionary Neural Cascade Search.
ENCAS can be used to search over multiple pretrained supernetworks.
We test ENCAS on common computer vision benchmarks.
arXiv Detail & Related papers (2022-03-08T11:06:01Z) - Neural Architecture Search for Image Super-Resolution Using Densely
Constructed Search Space: DeCoNAS [18.191710317555952]
We use neural architecture search (NAS) methods to find a lightweight densely connected network named DeCoNASNet.
We define a complexity-based penalty for solving image super-resolution, which can be considered a multi-objective problem.
Experiments show that our DeCoNASNet outperforms state-of-the-art lightweight super-resolution networks designed both by hand and by existing NAS-based methods.
arXiv Detail & Related papers (2021-04-19T04:51:16Z) - Enhanced Gradient for Differentiable Architecture Search [17.431144144044968]
We propose a neural network architecture search algorithm aiming to simultaneously improve network performance and reduce network complexity.
The proposed framework automatically builds the network architecture at two stages: block-level search and network-level search.
Experiment results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification.
arXiv Detail & Related papers (2021-03-23T13:27:24Z) - Firefly Neural Architecture Descent: a General Approach for Growing
Neural Networks [50.684661759340145]
Firefly neural architecture descent is a general framework for progressively and dynamically growing neural networks.
We show that firefly descent can flexibly grow networks both wider and deeper, and can be applied to learn accurate but resource-efficient neural architectures.
In particular, it learns networks that are smaller in size but have higher average accuracy than those learned by the state-of-the-art methods.
arXiv Detail & Related papers (2021-02-17T04:47:18Z) - Hierarchical Neural Architecture Search for Deep Stereo Matching [131.94481111956853]
We propose the first end-to-end hierarchical NAS framework for deep stereo matching.
Our framework incorporates task-specific human knowledge into the neural architecture search framework.
It ranks first in accuracy on the KITTI stereo 2012 and 2015 and the Middlebury benchmarks, as well as first on the SceneFlow dataset.
arXiv Detail & Related papers (2020-10-26T11:57:37Z) - NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z) - DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.