Efficient Search of Comprehensively Robust Neural Architectures via
Multi-fidelity Evaluation
- URL: http://arxiv.org/abs/2305.07308v1
- Date: Fri, 12 May 2023 08:28:58 GMT
- Title: Efficient Search of Comprehensively Robust Neural Architectures via
Multi-fidelity Evaluation
- Authors: Jialiang Sun, Wen Yao, Tingsong Jiang, Xiaoqian Chen
- Abstract summary: We propose a novel efficient search of comprehensively robust neural architectures via multi-fidelity evaluation (ES-CRNA-ME).
Specifically, we first search for comprehensively robust architectures under multiple types of evaluations using a weight-sharing-based NAS method.
We then reduce the number of robustness evaluations via correlation analysis, which merges similar evaluations and lowers the evaluation cost.
- Score: 1.9100854225243937
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Neural architecture search (NAS) has emerged as a successful
technique for finding robust deep neural network (DNN) architectures. However,
most existing robustness evaluations in NAS consider only $l_{\infty}$
norm-based adversarial noise. To improve the robustness of DNN models against
multiple types of noise, NAS must consider a comprehensive set of robustness
evaluations. However, as the number of evaluation types grows, finding
comprehensively robust architectures becomes increasingly time-consuming. To
alleviate this problem, we propose a novel efficient search
of comprehensively robust neural architectures via multi-fidelity evaluation
(ES-CRNA-ME). Specifically, we first search for comprehensively robust
architectures under multiple types of evaluations, including different $l_{p}$
norm attacks, semantic adversarial attacks, and composite adversarial attacks,
using a weight-sharing-based NAS method. In addition, we reduce the number of
robustness evaluations through correlation analysis, which merges similar
evaluations and decreases the evaluation cost.
Finally, we propose a multi-fidelity online surrogate to further decrease the
search cost during optimization. Starting from a surrogate constructed on
low-fidelity data, high-fidelity data collected online is used to fine-tune
the surrogate. Experiments on the CIFAR10 and CIFAR100 datasets show the
effectiveness
of our proposed method.
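
To make the two cost-saving ideas concrete, below is a minimal, hypothetical
sketch (not the authors' released code): a greedy correlation analysis that
keeps one representative from each group of strongly correlated robustness
evaluations, and a surrogate pretrained on low-fidelity scores and then
fine-tuned on high-fidelity scores gathered online. All data, the 0.9
threshold, and the MLP surrogate choice are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def merge_correlated_evaluations(scores, threshold=0.9):
    """Greedily keep one representative per group of correlated evaluations.

    scores: hypothetical (n_architectures, n_evaluations) matrix of
    robustness scores measured on supernet-sampled architectures.
    """
    corr = np.corrcoef(scores, rowvar=False)
    kept = []
    for j in range(scores.shape[1]):
        # keep evaluation j only if it is not strongly correlated with
        # any evaluation we already decided to keep
        if all(abs(corr[j, k]) < threshold for k in kept):
            kept.append(j)
    return kept  # indices of the evaluations still worth running

evals_to_run = merge_correlated_evaluations(rng.random((50, 8)))

# Multi-fidelity online surrogate: pretrain on cheap low-fidelity labels
# (e.g. few-step attacks), then fine-tune on expensive high-fidelity labels
# collected during the search; warm_start=True makes fit() continue from
# the current weights instead of re-initialising them.
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), warm_start=True,
                         max_iter=300, random_state=0)

x_lo, y_lo = rng.random((200, 16)), rng.random(200)  # toy low-fidelity data
surrogate.fit(x_lo, y_lo)                            # base surrogate

x_hi, y_hi = rng.random((20, 16)), rng.random(20)    # a few costly labels
surrogate.fit(x_hi, y_hi)                            # online fine-tuning step
```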
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
However, NAS suffers from a key bottleneck: numerous architectures must be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
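The summary does not spell out how a pairwise comparator replaces direct evaluation, so here is a minimal, hypothetical sketch of the general idea (the toy data, features, and logistic-regression comparator are all assumptions, not SMEM-NAS's actual design): a binary classifier learns to predict which of two architectures performs better, and candidates are then ranked by predicted wins instead of being fully trained and evaluated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy stand-ins: architecture encodings and their (normally unknown) accuracies.
rng = np.random.default_rng(0)
encodings = rng.random((100, 16))
accuracies = rng.random(100)

# Pairwise training set: each feature vector concatenates two encodings;
# the label is 1 if the first architecture outperforms the second.
idx_a = rng.integers(0, 100, 500)
idx_b = rng.integers(0, 100, 500)
X = np.hstack([encodings[idx_a], encodings[idx_b]])
y = (accuracies[idx_a] > accuracies[idx_b]).astype(int)
comparator = LogisticRegression(max_iter=1000).fit(X, y)

def rank_by_predicted_wins(candidates):
    """Order candidates best-first by how often the comparator picks them."""
    wins = np.zeros(len(candidates))
    for i, ci in enumerate(candidates):
        for j, cj in enumerate(candidates):
            if i != j:
                wins[i] += comparator.predict(np.hstack([ci, cj])[None, :])[0]
    return np.argsort(-wins)

print(rank_by_predicted_wins(encodings[:10]))
```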
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z)
- Generalizable Lightweight Proxy for Robust NAS against Diverse Perturbations [59.683234126055694]
Recent neural architecture search (NAS) frameworks have been successful in finding optimal architectures for given conditions.
We propose a novel lightweight robust zero-cost proxy that considers the consistency across features, parameters, and gradients of both clean and perturbed images.
Our approach facilitates an efficient and rapid search for neural architectures capable of learning generalizable features that exhibit robustness across diverse perturbations.
arXiv Detail & Related papers (2023-06-08T08:34:26Z)
- Bi-fidelity Evolutionary Multiobjective Search for Adversarially Robust Deep Neural Architectures [19.173285459139592]
This paper proposes a bi-fidelity multiobjective neural architecture search approach.
In addition to a low-fidelity performance predictor, we leverage an auxiliary objective whose value is the output of a surrogate model trained with high-fidelity evaluations.
The effectiveness of the proposed approach is confirmed by extensive experiments conducted on CIFAR-10, CIFAR-100 and SVHN datasets.
arXiv Detail & Related papers (2022-07-12T05:26:09Z)
- Searching for Robust Neural Architectures via Comprehensive and Reliable Evaluation [6.612134996737988]
We propose a novel framework, called Auto Adversarial Attack and Defense (AAAD), where we employ neural architecture search methods to discover robust architectures.
We consider four types of robustness evaluations, including adversarial noise, natural noise, system noise and quantified metrics.
The empirical results on the CIFAR10 dataset show that the searched efficient attack could help find more robust architectures.
arXiv Detail & Related papers (2022-03-07T04:45:05Z)
- $\beta$-DARTS: Beta-Decay Regularization for Differentiable Architecture Search [85.84110365657455]
We propose a simple but efficient regularization method, termed Beta-Decay, to regularize the DARTS-based NAS search process.
Experimental results on NAS-Bench-201 show that our proposed method can help to stabilize the search process and make the searched network more transferable across different datasets.
arXiv Detail & Related papers (2022-03-03T11:47:14Z)
- Accelerating Evolutionary Neural Architecture Search via Multi-Fidelity Evaluation [5.754705118117044]
We propose an accelerated ENAS method via multi-fidelity evaluation, termed MFENAS.
Results on CIFAR-10 show that the architecture obtained by the proposed MFENAS achieves a 2.39% test error rate at a cost of only 0.6 GPU days on a single NVIDIA 2080Ti GPU.
arXiv Detail & Related papers (2021-08-10T09:32:26Z)
- iDARTS: Differentiable Architecture Search with Stochastic Implicit Gradients [75.41173109807735]
Differentiable ARchiTecture Search (DARTS) has recently become the mainstream approach to neural architecture search (NAS).
We tackle the hypergradient computation in DARTS based on the implicit function theorem.
We show that the architecture optimisation with the proposed method, named iDARTS, is expected to converge to a stationary point.
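As background, the hypergradient that the implicit function theorem yields for DARTS's bilevel problem has the standard form (notation assumed here, not quoted from the paper): $\nabla_{\alpha}\mathcal{L}_{val} = \frac{\partial \mathcal{L}_{val}}{\partial \alpha} - \frac{\partial \mathcal{L}_{val}}{\partial w^{*}} \left( \frac{\partial^{2} \mathcal{L}_{train}}{\partial w \, \partial w^{\top}} \right)^{-1} \frac{\partial^{2} \mathcal{L}_{train}}{\partial w \, \partial \alpha}$, where $w^{*}(\alpha)$ minimizes the training loss $\mathcal{L}_{train}$; the costly inverse-Hessian term is the quantity that implicit-gradient methods approximate rather than compute exactly.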
arXiv Detail & Related papers (2021-06-21T00:44:11Z)
- DSRNA: Differentiable Search of Robust Neural Architectures [11.232234265070753]
In deep learning applications, the architectures of deep neural networks are crucial in achieving high accuracy.
We propose methods to perform differentiable search of robust neural architectures.
Our methods are more robust to various norm-bounded attacks than several robust NAS baselines.
arXiv Detail & Related papers (2020-12-11T04:52:54Z)
- Effective, Efficient and Robust Neural Architecture Search [4.273005643715522]
Recent advances in adversarial attacks show the vulnerability of deep neural networks searched by Neural Architecture Search (NAS).
We propose an Effective, Efficient, and Robust Neural Architecture Search (E2RNAS) method to search a neural network architecture by taking the performance, robustness, and resource constraint into consideration.
Experiments on benchmark datasets show that the proposed E2RNAS method can find adversarially robust architectures with optimized model size and comparable classification accuracy.
arXiv Detail & Related papers (2020-11-19T13:46:23Z)
- DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)