A Survey on Multi-Objective Neural Architecture Search
- URL: http://arxiv.org/abs/2307.09099v1
- Date: Tue, 18 Jul 2023 09:42:51 GMT
- Title: A Survey on Multi-Objective Neural Architecture Search
- Authors: Seyed Mahdi Shariatzadeh, Mahmood Fathy, Reza Berangi, Mohammad
Shahverdy
- Abstract summary: Multi-Objective Neural Architecture Search (MONAS) has been attracting attention.
We present an overview of principal and state-of-the-art works in the field of MONAS.
- Score: 9.176056742068813
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, expert-crafted neural architectures have increasingly been
overtaken by neural architecture search (NAS), the automatic generation (and
tuning) of network structures, which is closely related to hyperparameter
optimization and automated machine learning (AutoML). While early NAS attempts
optimized only prediction accuracy, Multi-Objective Neural Architecture Search
(MONAS) has been attracting attention; it considers further goals such as
computational complexity, power consumption, and network size, seeking a
trade-off between accuracy and other features such as computational cost. In
this paper, we present an overview of principal and state-of-the-art works in
the field of MONAS. Starting from a well-categorized taxonomy and formulation
for NAS, we address and correct some miscategorizations in previous surveys of
the NAS field. We also provide a list of all known objectives used, add a
number of new ones, and elaborate their specifications. We provide analyses of
the most important objectives and show that the stochastic properties of some
of them should be treated differently from deterministic ones in the
multi-objective optimization procedure of NAS. We close the paper with a number
of future directions and topics in the field of MONAS.
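The trade-off the abstract describes is usually formalized through Pareto dominance: one architecture dominates another if it is no worse in every objective and strictly better in at least one. The sketch below is a minimal illustration of that check, together with one hedged way of acting on the survey's deterministic/stochastic distinction, namely scoring a noisy objective (measured latency) by an upper-confidence estimate rather than its raw mean. All names and numbers here are hypothetical and not taken from the paper.

```python
# Minimal illustration (not from the survey): Pareto dominance between
# architectures, with a stochastic objective (measured latency) handled
# conservatively. All names and numbers here are hypothetical.
from statistics import mean, stdev

def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def latency_estimate(samples_ms, k=2.0):
    """Upper-confidence estimate for a noisy objective, instead of the raw mean."""
    return mean(samples_ms) + k * stdev(samples_ms)

# Candidate architectures as (error rate, parameter count, latency estimate).
# Parameter count is deterministic; latency is measured over repeated runs.
arch_a = (0.08, 4.2e6, latency_estimate([12.1, 12.4, 11.9]))
arch_b = (0.09, 5.0e6, latency_estimate([13.0, 13.3, 12.8]))

print(dominates(arch_a, arch_b))  # True: arch_a is no worse everywhere, better somewhere
```

Comparing upper-confidence estimates rather than raw means makes dominance decisions more robust to measurement noise, which is one simple way to give stochastic objectives the separate treatment the survey argues for.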
Related papers
- Efficient Multi-Objective Neural Architecture Search via Pareto Dominance-based Novelty Search [0.0]
Neural Architecture Search (NAS) aims to automate the discovery of high-performing deep neural network architectures.
Traditional NAS approaches typically optimize a certain performance metric (e.g., prediction accuracy) overlooking large parts of the architecture search space that potentially contain interesting network configurations.
This paper presents a novelty search for multi-objective NAS with Multiple Training-Free metrics (MTF-PDNS).
arXiv Detail & Related papers (2024-07-30T08:52:10Z) - A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism; an illustrative sketch of such an evolutionary loop appears after this list.
arXiv Detail & Related papers (2024-07-22T12:46:22Z) - DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z) - Multi-objective Differentiable Neural Architecture Search [58.67218773054753]
We propose a novel NAS algorithm that encodes user preferences for the trade-off between performance and hardware metrics.
Our method outperforms existing MOO NAS methods across a broad range of qualitatively different search spaces and datasets.
arXiv Detail & Related papers (2024-02-28T10:09:04Z) - Surrogate-assisted Multi-objective Neural Architecture Search for
Real-time Semantic Segmentation [11.866947846619064]
Neural architecture search (NAS) has emerged as a promising avenue toward automating the design of architectures.
We propose a surrogate-assisted multi-objective method to address the challenges of applying NAS to semantic segmentation.
Our method can identify architectures significantly outperforming existing state-of-the-art architectures designed both manually by human experts and automatically by other NAS methods.
arXiv Detail & Related papers (2022-08-14T10:18:51Z) - Neural Architecture Search for Dense Prediction Tasks in Computer Vision [74.9839082859151]
Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims at automatically designing neural network architectures in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
arXiv Detail & Related papers (2022-02-15T08:06:50Z) - Differentiable NAS Framework and Application to Ads CTR Prediction [30.74403362212425]
We implement a modular, inference-ready framework for Differentiable Neural Architecture Search (DNAS).
We apply DNAS to the problem of ads click-through rate (CTR) prediction, arguably the highest-value and most worked on AI problem at hyperscalers today.
We develop and tailor novel search spaces to a Deep Learning Recommendation Model (DLRM) backbone for CTR prediction, and report state-of-the-art results on the Criteo Kaggle CTR prediction dataset.
arXiv Detail & Related papers (2021-10-25T05:46:27Z) - Bag of Baselines for Multi-objective Joint Neural Architecture Search
and Hyperparameter Optimization [29.80410614305851]
Neural architecture search (NAS) and hyperparameter optimization (HPO) make deep learning accessible to non-experts.
We propose a set of methods that extend current approaches to jointly optimize neural architectures and hyperparameters with respect to multiple objectives.
These methods will serve as simple baselines for future research on multi-objective joint NAS + HPO.
arXiv Detail & Related papers (2021-05-03T17:04:56Z) - MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z) - NAS-Count: Counting-by-Density with Neural Architecture Search [74.92941571724525]
We automate the design of counting models with Neural Architecture Search (NAS).
We introduce an end-to-end searched encoder-decoder architecture, Automatic Multi-Scale Network (AMSNet).
arXiv Detail & Related papers (2020-02-29T09:18:17Z)
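Several of the methods above (e.g., SMEM-NAS and the surrogate-assisted segmentation search) are evolutionary multi-objective searches: mutate a population of encoded architectures, evaluate each candidate on several objectives, and keep the non-dominated set. The following is a minimal, self-contained sketch of that loop under toy assumptions; the encoding, the `evaluate` stand-in, and the population size are all illustrative and do not reproduce any cited paper's algorithm.

```python
# Hedged sketch of a multi-objective evolutionary NAS loop, in the spirit
# of the evolutionary methods above; every identifier is illustrative,
# not an API from any of the cited papers.
import random

def random_arch():
    # Toy encoding: one choice from {1, 2, 3} for each of six stages.
    return [random.choice([1, 2, 3]) for _ in range(6)]

def mutate(arch):
    # Flip a single randomly chosen gene to a new random value.
    child = arch[:]
    child[random.randrange(len(child))] = random.choice([1, 2, 3])
    return child

def evaluate(arch):
    # Stand-in for (validation error, model size); a real search would
    # train the network, use training-free metrics, or query a surrogate.
    err = 1.0 / (1 + sum(arch)) + random.uniform(0, 0.01)
    size = sum(a * a for a in arch)
    return (err, size)

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(pop):
    # Keep candidates whose objectives no other candidate Pareto-dominates.
    return [p for p in pop if not any(dominates(q[1], p[1]) for q in pop if q is not p)]

population = [(a, evaluate(a)) for a in (random_arch() for _ in range(20))]
for _ in range(30):  # generations
    children = [(c, evaluate(c)) for c in (mutate(p[0]) for p in population)]
    population = non_dominated(population + children)[:20]

for arch, (err, size) in population:
    print(arch, f"err={err:.3f}", f"size={size}")
```

In practice the `evaluate` step dominates the cost of such a loop, which is exactly what the surrogate models, training-free metrics, and weight-sharing techniques surveyed in the papers above are designed to cheapen.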