Pareto-wise Ranking Classifier for Multi-objective Evolutionary Neural
Architecture Search
- URL: http://arxiv.org/abs/2109.07582v2
- Date: Sat, 9 Mar 2024 00:01:32 GMT
- Title: Pareto-wise Ranking Classifier for Multi-objective Evolutionary Neural
Architecture Search
- Authors: Lianbo Ma, Nan Li, Guo Yu, Xiaoyu Geng, Min Huang and Xingwei Wang
- Abstract summary: This study focuses on how to find feasible deep models under diverse design objectives.
We propose a classification-wise Pareto evolution approach for one-shot NAS, where an online classifier is trained to predict the dominance relationship between the candidate and constructed reference architectures.
We find a number of neural architectures with different model sizes ranging from 2M to 6M under diverse objectives and constraints.
- Score: 15.454709248397208
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In the deployment of deep neural models, how to effectively and automatically
find feasible deep models under diverse design objectives is fundamental. Most
existing neural architecture search (NAS) methods utilize surrogates to predict
the detailed performance (e.g., accuracy and model size) of a candidate
architecture during the search, which, however, is complicated and inefficient.
In contrast, we aim to learn an efficient Pareto classifier to simplify the
search process of NAS by transforming the complex multi-objective NAS task into
a simple Pareto-dominance classification task. To this end, we propose a
classification-wise Pareto evolution approach for one-shot NAS, where an online
classifier is trained to predict the dominance relationship between the
candidate and constructed reference architectures, instead of using surrogates
to fit the objective functions. The main contribution of this study is to
change supernet adaptation into a Pareto classifier. In addition, we design two
adaptive schemes to select the reference set of architectures for constructing the
classification boundary and to regulate the ratio of positive samples to negative
ones, respectively. We compare the proposed evolution approach with
state-of-the-art approaches on widely used benchmark datasets, and experimental
results indicate that the proposed approach outperforms the others and has found
a number of neural architectures with different model sizes ranging from 2M to
6M under diverse objectives and constraints.
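To make the dominance-classification idea concrete, here is a minimal sketch: evaluated candidates are paired with a reference set, each pair is labelled by the Pareto-dominance relation, and an online classifier learns that boundary. The encodings, objective values, and classifier choice are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimized)."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

rng = np.random.default_rng(0)
ref_encs = rng.random((3, 16))  # hypothetical reference architecture encodings
ref_objs = np.array([[0.12, 4.1], [0.10, 5.8], [0.15, 2.3]])  # (error, model size)

def make_pairs(encs, objs):
    """Label each (candidate, reference) pair by the dominance relation."""
    X, y = [], []
    for enc, obj in zip(encs, objs):
        for renc, robj in zip(ref_encs, ref_objs):
            X.append(np.concatenate([enc, renc]))
            y.append(int(dominates(obj, robj)))
    return np.array(X), np.array(y)

clf = SGDClassifier(loss="log_loss")        # online: updated as evaluations arrive
cand_encs = rng.random((8, 16))
cand_objs = rng.random((8, 2)) * [0.2, 6.0] # stand-ins for evaluated objectives
X, y = make_pairs(cand_encs, cand_objs)
clf.partial_fit(X, y, classes=[0, 1])

# Rank unseen candidates by how many references they are predicted to dominate.
new_encs = rng.random((5, 16))
scores = [clf.predict(np.hstack([np.tile(e, (len(ref_encs), 1)), ref_encs])).sum()
          for e in new_encs]
```

In the paper, the reference set and the positive-to-negative sample ratio are adapted during the search by the two schemes mentioned above; both are fixed here for brevity.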
Related papers
- Informed deep hierarchical classification: a non-standard analysis inspired approach [0.0]
It consists of a multi-output deep neural network equipped with specific projection operators placed before each output layer.
The design of such an architecture, called lexicographic hybrid deep neural network (LH-DNN), has been possible by combining tools from different and quite distant research fields.
To assess the efficacy of the approach, the resulting network is compared against the B-CNN, a convolutional neural network tailored for hierarchical classification tasks.
arXiv Detail & Related papers (2024-09-25T14:12:50Z)
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise-comparison-relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
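The pairwise-comparison idea can be sketched as a learned comparator that predicts which of two encoded architectures is better, so that ranking a population needs no further full evaluations; the encodings, quality signal, and random-forest model below are assumptions for illustration, not the SMEM-NAS design.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
encs = rng.random((40, 16))  # encodings of architectures evaluated so far
quality = rng.random(40)     # stand-in scalar quality (e.g., validation accuracy)

# Train a comparator on ordered pairs: label 1 iff the first architecture is better.
pairs = [np.concatenate([encs[i], encs[j]])
         for i in range(len(encs)) for j in range(len(encs)) if i != j]
labels = [int(quality[i] > quality[j])
          for i in range(len(encs)) for j in range(len(encs)) if i != j]
cmp_model = RandomForestClassifier(n_estimators=50).fit(pairs, labels)

# Rank new candidates by predicted wins, avoiding further full evaluations.
cands = rng.random((10, 16))
wins = [sum(cmp_model.predict([np.concatenate([cands[i], cands[j]])])[0]
            for j in range(len(cands)) if j != i)
        for i in range(len(cands))]
order = np.argsort(wins)[::-1]  # most promising first
```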
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- Continuous Cartesian Genetic Programming based representation for Multi-Objective Neural Architecture Search [12.545742558041583]
We propose a novel approach for designing less complex yet highly effective convolutional neural networks (CNNs).
Our approach combines real-based and block-chained CNN representations based on Cartesian genetic programming (CGP) for neural architecture search (NAS).
Two variants are introduced that differ in the granularity of the search space they consider.
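As a toy illustration of a CGP-style genotype and its decoding (in the paper the nodes would be CNN blocks and parts of the encoding real-valued; the primitive set here is invented):

```python
import numpy as np

# Toy CGP-style genotype: each node is (function_id, input_a, input_b),
# where inputs index earlier nodes; index 0 is the raw input.
FUNCS = [np.add, np.subtract, np.maximum]   # invented primitive set

def decode(genotype, x):
    """Evaluate the graph encoded by the genotype on input x."""
    values = [x]
    for fid, a, b in genotype:
        values.append(FUNCS[fid](values[a], values[b]))
    return values[-1]

# node1 = x + x; node2 = max(node1, x)
genotype = [(0, 0, 0), (2, 1, 0)]
print(decode(genotype, np.array([1.0, -2.0, 3.0])))   # -> [2. -2. 6.]
```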
arXiv Detail & Related papers (2023-06-05T07:32:47Z)
- HKNAS: Classification of Hyperspectral Imagery Based on Hyper Kernel Neural Architecture Search [104.45426861115972]
We propose to directly generate structural parameters by utilizing specifically designed hyper kernels.
We obtain three kinds of networks to separately conduct pixel-level or image-level classifications with 1-D or 3-D convolutions.
A series of experiments on six public datasets demonstrate that the proposed methods achieve state-of-the-art results.
arXiv Detail & Related papers (2023-04-23T17:27:40Z)
- Surrogate-assisted Multi-objective Neural Architecture Search for Real-time Semantic Segmentation [11.866947846619064]
Neural architecture search (NAS) has emerged as a promising avenue toward automating the design of architectures.
We propose a surrogate-assisted multi-objective method to address the challenges of applying NAS to semantic segmentation.
Our method can identify architectures significantly outperforming existing state-of-the-art architectures designed both manually by human experts and automatically by other NAS methods.
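For contrast with the dominance classifier above, a conventional surrogate regresses an objective directly from an architecture encoding. A minimal sketch with placeholder encodings and mIoU targets; the paper's actual surrogate model is not reproduced here:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(3)
encs = rng.random((30, 16))   # encodings of architectures already trained
miou = rng.random(30)         # stand-in measured mIoU for each of them

surrogate = GaussianProcessRegressor().fit(encs, miou)

# Screen a large candidate pool cheaply; only the most promising
# architectures would then be trained and evaluated for real.
pool = rng.random((500, 16))
pred, std = surrogate.predict(pool, return_std=True)
top = np.argsort(pred)[-10:]  # shortlist for full evaluation
```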
arXiv Detail & Related papers (2022-08-14T10:18:51Z)
- AutoBERT-Zero: Evolving BERT Backbone from Scratch [94.89102524181986]
We propose an Operation-Priority Neural Architecture Search (OP-NAS) algorithm to automatically search for promising hybrid backbone architectures.
We optimize both the search algorithm and evaluation of candidate models to boost the efficiency of our proposed OP-NAS.
Experiments show that the searched architecture (named AutoBERT-Zero) significantly outperforms BERT and its variants of different model capacities in various downstream tasks.
arXiv Detail & Related papers (2021-07-15T16:46:01Z)
- AutoAdapt: Automated Segmentation Network Search for Unsupervised Domain Adaptation [4.793219747021116]
We perform neural architecture search (NAS) to provide an architecture-level perspective and analysis for domain adaptation.
We propose bridging this gap by using maximum mean discrepancy and regional weighted entropy to estimate the accuracy metric.
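Maximum mean discrepancy between source- and target-domain features can be computed as below; the RBF kernel and bandwidth are common choices for MMD rather than necessarily the paper's, and the feature arrays are placeholders:

```python
import numpy as np

def mmd_rbf(X, Y, gamma=1.0):
    """Biased (V-statistic) estimate of squared MMD with an RBF kernel."""
    def k(A, B):
        d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(2)
src = rng.normal(0.0, 1.0, (100, 8))   # e.g., source-domain feature vectors
tgt = rng.normal(0.5, 1.0, (100, 8))   # e.g., target-domain feature vectors
print(mmd_rbf(src, tgt))               # larger value => larger domain gap
```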
arXiv Detail & Related papers (2021-06-24T17:59:02Z)
- Redefining Neural Architecture Search of Heterogeneous Multi-Network Models by Characterizing Variation Operators and Model Components [71.03032589756434]
We investigate the effect of different variation operators in a complex domain, that of multi-network heterogeneous neural models.
We characterize both the variation operators, according to their effect on the complexity and performance of the model; and the models, relying on diverse metrics which estimate the quality of the different parts composing it.
arXiv Detail & Related papers (2021-06-16T17:12:26Z)
- Network Architecture Search for Domain Adaptation [11.24426822697648]
We present Neural Architecture Search for Domain Adaptation (NASDA), a principled framework that leverages differentiable neural architecture search to derive the optimal network architecture for the domain adaptation task.
We demonstrate experimentally that NASDA leads to state-of-the-art performance on several domain adaptation benchmarks.
arXiv Detail & Related papers (2020-08-13T06:15:57Z)
- DrNAS: Dirichlet Neural Architecture Search [88.56953713817545]
We treat the continuously relaxed architecture mixing weights as random variables, modeled by a Dirichlet distribution.
With recently developed pathwise derivatives, the Dirichlet parameters can be easily optimized with gradient-based optimizers.
To alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme.
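The Dirichlet reparameterization can be sketched in PyTorch, where `Dirichlet.rsample()` provides the pathwise derivative; the single two-operation edge below is a toy stand-in for a real search space:

```python
import torch
from torch.distributions import Dirichlet

# Learnable concentration parameters for two candidate operations on one edge.
log_alpha = torch.zeros(2, requires_grad=True)
opt = torch.optim.Adam([log_alpha], lr=0.1)

x = torch.randn(4, 8)
ops = [torch.nn.Linear(8, 8), torch.nn.Identity()]   # toy operation set

for _ in range(10):
    # rsample() is reparameterized, so gradients reach the concentrations.
    w = Dirichlet(log_alpha.exp()).rsample()
    out = w[0] * ops[0](x) + w[1] * ops[1](x)
    loss = out.pow(2).mean()          # stand-in for a validation loss
    opt.zero_grad()
    loss.backward()
    opt.step()

print(log_alpha.exp() / log_alpha.exp().sum())  # expected mixing weights
```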
arXiv Detail & Related papers (2020-06-18T08:23:02Z)
- DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.