Neural Architecture Search with an Efficient Multiobjective Evolutionary
Framework
- URL: http://arxiv.org/abs/2011.04463v1
- Date: Mon, 9 Nov 2020 14:41:10 GMT
- Title: Neural Architecture Search with an Efficient Multiobjective Evolutionary
Framework
- Authors: Maria Baldeon Calisto and Susana Lai-Yuen
- Abstract summary: We propose EMONAS, an Efficient MultiObjective Neural Architecture Search framework.
EMONAS is composed of a search space that considers both the macro- and micro-structure of the architecture.
It is evaluated on the task of 3D cardiac segmentation from the MICCAI ACDC challenge.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning methods have become very successful at solving many complex
tasks such as image classification and segmentation, speech recognition and
machine translation. Nevertheless, manually designing a neural network for a
specific problem is very difficult and time-consuming due to the massive
hyperparameter search space, long training times, and lack of technical
guidelines for the hyperparameter selection. Moreover, most networks are highly
complex, task specific and over-parametrized. Recently, multiobjective neural
architecture search (NAS) methods have been proposed to automate the design of
accurate and efficient architectures. However, they only optimize either the
macro- or micro-structure of the architecture, requiring the remaining
hyperparameters to be defined manually, and they do not use the information
produced during the optimization process to increase the efficiency of the
search. In
this work, we propose EMONAS, an Efficient MultiObjective Neural Architecture
Search framework for the automatic design of neural architectures while
optimizing the network's accuracy and size. EMONAS is composed of a search
space that considers both the macro- and micro-structure of the architecture,
and a surrogate-assisted multiobjective evolutionary based algorithm that
efficiently searches for the best hyperparameters using a Random Forest
surrogate and guiding selection probabilities. EMONAS is evaluated on the task
of 3D cardiac segmentation from the MICCAI ACDC challenge, which is crucial for
disease diagnosis, risk evaluation, and therapy decision. The architecture
found with EMONAS is ranked within the top 10 submissions of the challenge in
all evaluation metrics, performing better than or comparably to other
approaches while reducing the search time by more than 50% and requiring
considerably fewer parameters.
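The surrogate-assisted multiobjective evolutionary search described in the abstract can be sketched as follows. This is a minimal illustration, not EMONAS itself: the hyperparameter names, the stand-in objective functions, and the nearest-neighbour surrogate are all assumptions for the example (the paper uses a Random Forest surrogate, guided selection probabilities, and a richer macro/micro search space).

```python
import random

# Hypothetical hyperparameter search space (names are illustrative,
# not EMONAS's actual space): macro-structure (num_blocks) and
# micro-structure (kernel_size, base_filters).
SPACE = {
    "num_blocks": [2, 3, 4],
    "kernel_size": [3, 5],
    "base_filters": [8, 16, 32],
}

def sample_architecture(rng):
    return {k: rng.choice(v) for k, v in SPACE.items()}

def true_objectives(arch):
    # Stand-in for an expensive training run: returns (error, size),
    # both to be minimized, with an accuracy/size trade-off built in.
    size = arch["num_blocks"] * arch["base_filters"] * arch["kernel_size"] ** 2
    error = 1.0 / (1 + size) + 0.001 * arch["num_blocks"]
    return error, size

def dominates(a, b):
    # Pareto dominance: a is no worse in every objective and better in one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(evaluated):
    return [
        (arch, objs) for arch, objs in evaluated
        if not any(dominates(o2, objs) for _, o2 in evaluated if o2 != objs)
    ]

def surrogate_predict(candidate, archive):
    # Toy surrogate: average the objectives of the three most similar
    # archived architectures (EMONAS uses a Random Forest instead).
    def dist(a, b):
        return sum(a[k] != b[k] for k in SPACE)
    nearest = sorted(archive, key=lambda e: dist(candidate, e[0]))[:3]
    n = len(nearest)
    return tuple(sum(o[i] for _, o in nearest) / n for i in range(2))

def search(generations=10, pop_size=8, seed=0):
    rng = random.Random(seed)
    archive = [(a, true_objectives(a))
               for a in (sample_architecture(rng) for _ in range(pop_size))]
    for _ in range(generations):
        # Pre-screen candidates with the cheap surrogate and spend the
        # expensive true evaluation only on the most promising one.
        candidates = [sample_architecture(rng) for _ in range(pop_size)]
        best = min(candidates, key=lambda c: sum(surrogate_predict(c, archive)))
        archive.append((best, true_objectives(best)))
    return pareto_front(archive)

front = search()
```

The key efficiency idea, as in the abstract, is that the surrogate reuses information produced during the search so that most candidate architectures never need to be trained.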
Related papers
- A Survey on Neural Architecture Search Based on Reinforcement Learning [0.0]
This paper introduces the overall development of Neural Architecture Search.
We then focus mainly on providing an overall and understandable survey about Neural Architecture Search works.
arXiv Detail & Related papers (2024-09-26T17:28:10Z) - EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification capability.
Deep learning (DL) models have been recently successfully applied for eye movement recognition.
However, the DL architecture is still largely determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z) - A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z) - Multi-Objective Neural Architecture Search for In-Memory Computing [0.5892638927736115]
We employ neural architecture search (NAS) to enhance the efficiency of deploying diverse machine learning (ML) tasks on in-memory computing architectures.
Our evaluation of this NAS approach for IMC architecture deployment spans three distinct image classification datasets.
arXiv Detail & Related papers (2024-06-10T19:17:09Z) - Surrogate-assisted Multi-objective Neural Architecture Search for
Real-time Semantic Segmentation [11.866947846619064]
Neural architecture search (NAS) has emerged as a promising avenue toward automating the design of architectures.
We propose a surrogate-assisted multi-objective method to address the challenges of applying NAS to semantic segmentation.
Our method can identify architectures significantly outperforming existing state-of-the-art architectures designed both manually by human experts and automatically by other NAS methods.
arXiv Detail & Related papers (2022-08-14T10:18:51Z) - iDARTS: Differentiable Architecture Search with Stochastic Implicit
Gradients [75.41173109807735]
Differentiable ARchiTecture Search (DARTS) has recently become the mainstream of neural architecture search (NAS)
We tackle the hypergradient computation in DARTS based on the implicit function theorem.
We show that the architecture optimisation with the proposed method, named iDARTS, is expected to converge to a stationary point.
arXiv Detail & Related papers (2021-06-21T00:44:11Z) - MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS)
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z) - NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z) - Rethinking Performance Estimation in Neural Architecture Search [191.08960589460173]
We provide a novel yet systematic rethinking of performance estimation (PE) in a resource-constrained regime.
By combining BPE with various search algorithms, including reinforcement learning, evolutionary algorithms, random search, and differentiable architecture search, we achieve a 1,000x NAS speed-up with a negligible performance drop.
arXiv Detail & Related papers (2020-05-20T09:01:44Z)
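Several of the entries above (EM-DARTS, iDARTS, DARTS itself) build on DARTS's continuous relaxation, in which each discrete operation choice is replaced by a softmax-weighted mixture of candidate operations so the architecture becomes differentiable. Below is a minimal sketch of that idea; the toy scalar operations and the hand-set architecture weights are illustrative assumptions, not DARTS's actual operation set.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of architecture weights.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Toy candidate operations acting on a scalar feature (illustrative only;
# real DARTS mixes convolutions, pooling, skip connections, etc.).
OPS = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0,
}

def mixed_op(x, alphas):
    # Continuous relaxation: the edge output is a softmax-weighted sum
    # of all candidate operations, differentiable in the alphas.
    weights = softmax(alphas)
    return sum(w * op(x) for w, op in zip(weights, OPS.values()))

def discretize(alphas):
    # After the search, keep only the operation with the largest weight.
    names = list(OPS)
    return names[max(range(len(alphas)), key=lambda i: alphas[i])]

alphas = [0.1, 1.5, -2.0]   # in real DARTS these are learned by gradient descent
y = mixed_op(3.0, alphas)
chosen = discretize(alphas)
```

In full DARTS the alphas are optimized jointly with the network weights in a bilevel fashion; iDARTS's contribution above is computing that hypergradient via the implicit function theorem rather than by unrolled differentiation.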
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.