Multi-Objective Neural Architecture Search Based on Diverse Structures
and Adaptive Recommendation
- URL: http://arxiv.org/abs/2007.02749v2
- Date: Thu, 13 Aug 2020 15:14:07 GMT
- Title: Multi-Objective Neural Architecture Search Based on Diverse Structures
and Adaptive Recommendation
- Authors: Chunnan Wang, Hongzhi Wang, Guosheng Feng, Fei Geng
- Abstract summary: The search space of neural architecture search (NAS) for convolutional neural network (CNN) is huge.
We propose the MoARR algorithm, which utilizes existing research results and historical information to quickly find architectures that are both lightweight and accurate.
Experimental results show that our MoARR can achieve a powerful and lightweight model (with 1.9% error rate and 2.3M parameters) on CIFAR-10 in 6 GPU hours.
- Score: 4.595675084986132
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The search space of neural architecture search (NAS) for convolutional neural
network (CNN) is huge. To reduce the search cost, most NAS algorithms use a fixed
outer network-level structure and search only the repeatable cell structure.
Such fixed architectures perform well when enough cells and channels are used.
However, when the architecture becomes more lightweight, the performance
decreases significantly. To obtain better lightweight architectures, more
flexible and diversified neural architectures are in demand, and more efficient
methods should be designed for the larger search space. Motivated by this, we
propose the MoARR algorithm, which utilizes existing research results and
historical information to quickly find architectures that are both lightweight
and accurate. We use the discovered high-performance cells to construct network
architectures. This method increases network architecture diversity while also
reducing the search space of cell structure design. In addition, we design a
novel multi-objective method to effectively analyze the historical evaluation
information, so as to efficiently search for the Pareto-optimal architectures
with high accuracy and a small number of parameters. Experimental results show
that our MoARR can achieve a powerful and lightweight model (with a 1.9% error
rate and 2.3M parameters) on CIFAR-10 in 6 GPU hours, which is better than the
state of the art. The explored architecture is transferable to ImageNet and
achieves 76.0% top-1 accuracy with 4.9M parameters.
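The multi-objective selection described in the abstract amounts to keeping only Pareto-optimal architectures over the two objectives (error rate, parameter count). The following is a minimal sketch of such a Pareto-front filter; the candidate values are hypothetical illustrations, not results from the paper.

```python
def pareto_front(candidates):
    """Return the candidates not dominated on (error_rate, params).

    A candidate dominates another if it is no worse on both
    objectives and strictly better on at least one.
    """
    front = []
    for i, (err_i, par_i) in enumerate(candidates):
        dominated = any(
            err_j <= err_i and par_j <= par_i
            and (err_j < err_i or par_j < par_i)
            for j, (err_j, par_j) in enumerate(candidates)
            if j != i
        )
        if not dominated:
            front.append((err_i, par_i))
    return front


# Hypothetical (error %, params in millions) pairs for explored architectures.
archs = [(1.9, 2.3), (2.5, 1.8), (2.0, 3.0), (1.8, 4.9), (3.0, 2.0)]
print(pareto_front(archs))  # the non-dominated trade-off points
```

Here (2.0, 3.0) is discarded because (1.9, 2.3) beats it on both objectives; the remaining points each trade accuracy against model size, which is the set a multi-objective NAS method aims to approximate.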
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have recently been successfully applied to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
However, NAS suffers from a key bottleneck: numerous architectures must be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- Flexible Channel Dimensions for Differentiable Architecture Search [50.33956216274694]
We propose a novel differentiable neural architecture search method with an efficient dynamic channel allocation algorithm.
We show that the proposed framework is able to find DNN architectures that are equivalent to previous methods in task accuracy and inference latency.
arXiv Detail & Related papers (2023-06-13T15:21:38Z)
- D-DARTS: Distributed Differentiable Architecture Search [75.12821786565318]
Differentiable ARchiTecture Search (DARTS) is one of the most trending Neural Architecture Search (NAS) methods.
We propose D-DARTS, a novel solution that addresses this problem by nesting several neural networks at the cell level.
arXiv Detail & Related papers (2021-08-20T09:07:01Z)
- Rethinking Architecture Selection in Differentiable NAS [74.61723678821049]
Differentiable Neural Architecture Search is one of the most popular NAS methods, owing to its search efficiency and simplicity.
We propose an alternative perturbation-based architecture selection that directly measures each operation's influence on the supernet.
We find that several failure modes of DARTS can be greatly alleviated with the proposed selection method.
arXiv Detail & Related papers (2021-08-10T00:53:39Z)
- Evolving Neural Architecture Using One Shot Model [5.188825486231326]
We propose a novel way of applying a simple genetic algorithm to the NAS problem, called EvNAS (Evolving Neural Architecture using One Shot Model).
EvNAS searches for the architecture on the proxy dataset, i.e. CIFAR-10, in 4.4 GPU days on a single GPU and achieves a top-1 test error of 2.47%.
Results show the potential of evolutionary methods in solving the architecture search problem.
arXiv Detail & Related papers (2020-12-23T08:40:53Z)
- Disentangled Neural Architecture Search [7.228790381070109]
We propose disentangled neural architecture search (DNAS) which disentangles the hidden representation of the controller into semantically meaningful concepts.
DNAS successfully disentangles the architecture representations, including operation selection, skip connections, and number of layers.
Dense-sampling leads to neural architecture search with higher efficiency and better performance.
arXiv Detail & Related papers (2020-09-24T03:35:41Z)
- Off-Policy Reinforcement Learning for Efficient and Effective GAN Architecture Search [50.40004966087121]
We introduce a new reinforcement learning based neural architecture search (NAS) methodology for generative adversarial network (GAN) architecture search.
The key idea is to formulate the GAN architecture search problem as a Markov decision process (MDP) for smoother architecture sampling.
We exploit an off-policy GAN architecture search algorithm that makes efficient use of the samples generated by previous policies.
arXiv Detail & Related papers (2020-07-17T18:29:17Z)
- Fine-Grained Stochastic Architecture Search [6.277767522867666]
Fine-Grained Architecture Search (FiGS) is a differentiable search method that searches over a much larger set of candidate architectures.
FiGS simultaneously selects and modifies operators in the search space by applying a structured sparse regularization penalty.
We show results across 3 existing search spaces, matching or outperforming the original search algorithms.
arXiv Detail & Related papers (2020-06-17T01:04:14Z)
- ADWPNAS: Architecture-Driven Weight Prediction for Neural Architecture Search [6.458169480971417]
We propose an Architecture-Driven Weight Prediction (ADWP) approach for neural architecture search (NAS).
In our approach, we first design an architecture-intensive search space and then train a HyperNetwork by inputting encoded architecture parameters.
Results show that one search procedure can be completed in 4.0 GPU hours on CIFAR-10.
arXiv Detail & Related papers (2020-03-03T05:06:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.