POPNASv2: An Efficient Multi-Objective Neural Architecture Search
Technique
- URL: http://arxiv.org/abs/2210.02959v1
- Date: Thu, 6 Oct 2022 14:51:54 GMT
- Title: POPNASv2: An Efficient Multi-Objective Neural Architecture Search
Technique
- Authors: Andrea Falanti, Eugenio Lomurno, Stefano Samele, Danilo Ardagna,
Matteo Matteucci
- Abstract summary: This paper proposes a new version of the Pareto-optimal Progressive Neural Architecture Search, called POPNASv2.
Our approach enhances its first version and improves its performance.
Our efforts allow POPNASv2 to achieve PNAS-like performance with an average 4x search time speed-up.
- Score: 7.497722345725035
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Automating the search for the best neural network model is a task
that has gained more and more relevance in the last few years. In this context,
Neural Architecture Search (NAS) represents the most effective technique, whose
results rival state-of-the-art hand-crafted architectures. However, this
approach requires considerable computational resources as well as research
time, which makes its usage prohibitive in many real-world scenarios. With its
sequential model-based optimization strategy, Progressive Neural Architecture
Search (PNAS) represents a possible step forward in addressing this resource
issue. Despite the quality of the network architectures it finds, this
technique is still limited by its long search time. A significant step in this
direction has been taken by Pareto-Optimal Progressive Neural Architecture
Search (POPNAS), which expands PNAS with a time predictor to enable a trade-off
between search time and accuracy, framed as a multi-objective optimization
problem. This paper proposes a new version of the Pareto-Optimal Progressive
Neural Architecture Search, called POPNASv2. Our approach enhances its first
version and improves its performance. We expanded the search space by adding
new operators and improved the quality of both predictors to build more
accurate Pareto fronts. Moreover, we introduced cell equivalence checks and
enriched the search strategy with an adaptive greedy exploration step. Our
efforts allow POPNASv2 to achieve PNAS-like performance with an average 4x
search time speed-up.
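The abstract describes selecting candidate cells by building a Pareto front over two predicted objectives, accuracy and training time. As a rough illustration of that selection step only (the function and data below are hypothetical, not taken from POPNASv2), a candidate is kept when no other candidate is at least as accurate and at least as fast, and strictly better on one objective:

```python
# Hedged sketch of Pareto-front selection over two predicted objectives:
# maximize accuracy, minimize training time. All names and values here are
# illustrative assumptions, not the actual POPNASv2 predictors or data.

def pareto_front(candidates):
    """Keep candidates not dominated in (higher accuracy, lower time)."""
    front = []
    for name, acc, time in candidates:
        dominated = any(
            o_acc >= acc and o_time <= time and (o_acc > acc or o_time < time)
            for _, o_acc, o_time in candidates
        )
        if not dominated:
            front.append((name, acc, time))
    return front

# Illustrative predictor outputs: (cell id, predicted accuracy, predicted seconds)
preds = [
    ("cell_a", 0.92, 120.0),
    ("cell_b", 0.90, 60.0),   # less accurate but faster: stays on the front
    ("cell_c", 0.89, 150.0),  # dominated by cell_a (worse on both objectives)
    ("cell_d", 0.92, 200.0),  # dominated by cell_a (same accuracy, slower)
]
front = pareto_front(preds)
```

Under this sketch, only `cell_a` and `cell_b` survive: each remaining candidate is beaten on both objectives by at least one of them. POPNASv2's contribution, per the abstract, is making the two predictors feeding this step more accurate, so the resulting fronts better reflect the true accuracy/time trade-off.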
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z) - POPNASv3: a Pareto-Optimal Neural Architecture Search Solution for Image
and Time Series Classification [8.190723030003804]
This article presents the third version of a sequential model-based NAS algorithm targeting different hardware environments and multiple classification tasks.
Our method is able to find competitive architectures within large search spaces, while keeping a flexible structure and data processing pipeline to adapt to different tasks.
The experiments performed on images and time series classification datasets provide evidence that POPNASv3 can explore a large set of assorted operators and converge to optimal architectures suited for the type of data provided under different scenarios.
arXiv Detail & Related papers (2022-12-13T17:14:14Z) - Neural Architecture Search for Speech Emotion Recognition [72.1966266171951]
We propose to apply neural architecture search (NAS) techniques to automatically configure the SER models.
We show that NAS can improve SER performance (54.89% to 56.28%) while maintaining model parameter sizes.
arXiv Detail & Related papers (2022-03-31T10:16:10Z) - FNAS: Uncertainty-Aware Fast Neural Architecture Search [54.49650267859032]
Reinforcement learning (RL)-based neural architecture search (NAS) generally guarantees better convergence yet suffers from the requirement of huge computational resources.
We propose a general pipeline to accelerate the convergence of the rollout process as well as the RL process in NAS.
Experiments on the Mobile Neural Architecture Search (MNAS) search space show the proposed Fast Neural Architecture Search (FNAS) accelerates standard RL-based NAS process by 10x.
arXiv Detail & Related papers (2021-05-25T06:32:52Z) - ViPNAS: Efficient Video Pose Estimation via Neural Architecture Search [94.90294600817215]
We propose a novel neural architecture search (NAS) method, termed ViPNAS, to search networks in both spatial and temporal levels for fast online video pose estimation.
In the spatial level, we carefully design the search space with five different dimensions including network depth, width, kernel size, group number, and attentions.
In the temporal level, we search from a series of temporal feature fusions to optimize the total accuracy and speed across multiple video frames.
arXiv Detail & Related papers (2021-05-21T06:36:40Z) - Efficient Model Performance Estimation via Feature Histories [27.008927077173553]
An important step in the task of neural network design is the evaluation of a model's performance.
In this work, we use the evolution history of features of a network during the early stages of training to build a proxy classifier.
We show that our method can be combined with multiple search algorithms to find better solutions to a wide range of tasks.
arXiv Detail & Related papers (2021-03-07T20:41:57Z) - Effective, Efficient and Robust Neural Architecture Search [4.273005643715522]
Recent advances in adversarial attacks show the vulnerability of deep neural networks searched by Neural Architecture Search (NAS).
We propose an Effective, Efficient, and Robust Neural Architecture Search (E2RNAS) method to search a neural network architecture by taking the performance, robustness, and resource constraint into consideration.
Experiments on benchmark datasets show that the proposed E2RNAS method can find adversarially robust architectures with optimized model size and comparable classification accuracy.
arXiv Detail & Related papers (2020-11-19T13:46:23Z) - PV-NAS: Practical Neural Architecture Search for Video Recognition [83.77236063613579]
Deep neural networks for video tasks are highly customized, and the design of such networks requires domain experts and costly trial-and-error tests.
Recent advances in network architecture search have boosted image recognition performance by a large margin.
In this study, we propose a practical solution, namely Practical Video Neural Architecture Search (PV-NAS).
arXiv Detail & Related papers (2020-11-02T08:50:23Z) - MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS)
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z) - Progressive Automatic Design of Search Space for One-Shot Neural
Architecture Search [15.017964136568061]
It has been observed that a model with higher one-shot model accuracy does not necessarily perform better when stand-alone trained.
We propose Progressive Automatic Design of search space, named PAD-NAS.
In this way, PAD-NAS can automatically design the operations for each layer and achieve a trade-off between search space quality and model diversity.
arXiv Detail & Related papers (2020-05-15T14:21:07Z) - Neural Architecture Generator Optimization [9.082931889304723]
We are first to investigate casting NAS as a problem of finding the optimal network generator.
We propose a new, hierarchical and graph-based search space capable of representing an extremely large variety of network types.
arXiv Detail & Related papers (2020-04-03T06:38:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.