How Much Is Hidden in the NAS Benchmarks? Few-Shot Adaptation of a NAS Predictor
- URL: http://arxiv.org/abs/2311.18451v1
- Date: Thu, 30 Nov 2023 10:51:46 GMT
- Title: How Much Is Hidden in the NAS Benchmarks? Few-Shot Adaptation of a NAS Predictor
- Authors: Hrushikesh Loya, Łukasz Dudziak, Abhinav Mehrotra, Royson Lee, Javier Fernandez-Marques, Nicholas D. Lane, Hongkai Wen
- Abstract summary: We borrow from the rich field of meta-learning for few-shot adaptation and study the applicability of those methods to NAS.
Our meta-learning approach not only shows superior (or matching) performance in the cross-validation experiments but also extrapolates successfully to a new search space and to new tasks.
- Score: 22.87207410692821
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Neural architecture search has proven to be a powerful approach to designing and refining neural networks, often boosting their performance and efficiency over manually designed variations, but it comes with computational overhead. While a considerable amount of research has focused on lowering the cost of NAS for mainstream tasks, such as image classification, many of those improvements stem from the fact that such tasks are well studied in the broader context. Consequently, applying NAS to emerging and under-represented domains still carries a relatively high cost and/or uncertainty about the achievable gains. To address this issue, we turn our focus towards the recent growth of publicly available NAS benchmarks in an attempt to extract general NAS knowledge that is transferable across different tasks and search spaces. We borrow from the rich field of meta-learning for few-shot adaptation and carefully study the applicability of those methods to NAS, with a special focus on the relationship between task-level correlation (domain shift) and predictor transferability, which we deem critical for improving NAS on diverse tasks. In our experiments, we use six NAS benchmarks in conjunction, spanning 16 NAS settings in total; our meta-learning approach not only shows superior (or matching) performance in the cross-validation experiments but also extrapolates successfully to a new search space and to new tasks.
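The abstract does not spell out the adaptation procedure, so the following is only a minimal sketch of one standard way to meta-learn a performance predictor for few-shot adaptation: Reptile-style first-order meta-learning over benchmark tasks. The `Predictor` network, the encoding dimension, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of few-shot adaptation of a NAS accuracy predictor via
# Reptile-style first-order meta-learning. Every name and hyperparameter here
# is an illustrative assumption, not the paper's actual method.
import copy

import torch
import torch.nn as nn

class Predictor(nn.Module):
    """Maps a fixed-length architecture encoding to a predicted accuracy."""
    def __init__(self, enc_dim: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(enc_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

def inner_adapt(model, x, y, steps=10, lr=1e-2):
    """Fine-tune a copy of the predictor on a few (encoding, accuracy) pairs."""
    adapted = copy.deepcopy(model)
    opt = torch.optim.SGD(adapted.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        nn.functional.mse_loss(adapted(x), y).backward()
        opt.step()
    return adapted

def reptile_meta_train(model, tasks, meta_steps=1000, meta_lr=0.1):
    """Each task is an (x, y) tensor pair drawn from one NAS benchmark/setting."""
    for step in range(meta_steps):
        x, y = tasks[step % len(tasks)]
        adapted = inner_adapt(model, x, y)
        with torch.no_grad():  # Reptile: nudge meta-weights toward adapted weights
            for p, q in zip(model.parameters(), adapted.parameters()):
                p.add_(meta_lr * (q - p))
    return model
```

At test time, `inner_adapt` would be run on the handful of labeled architectures available for the new task, and the adapted predictor would then rank the remaining candidates.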
Related papers
- Robustifying and Boosting Training-Free Neural Architecture Search [49.828875134088904]
We propose a robustifying and boosting training-free NAS (RoBoT) algorithm to develop a robust and consistently better-performing metric on diverse tasks.
Remarkably, the expected performance of RoBoT can be theoretically guaranteed, improving over existing training-free NAS.
arXiv Detail & Related papers (2024-03-12T12:24:11Z)
- Meta-prediction Model for Distillation-Aware NAS on Unseen Datasets [55.2118691522524]
Distillation-aware Neural Architecture Search (DaNAS) aims to search for an optimal student architecture.
We propose a distillation-aware meta accuracy prediction model, DaSS (Distillation-aware Student Search), which can predict a given architecture's final performance on a dataset.
arXiv Detail & Related papers (2023-05-26T14:00:35Z)
- Generalization Properties of NAS under Activation and Skip Connection Search [66.8386847112332]
We study the generalization properties of Neural Architecture Search (NAS) under a unifying framework.
We derive the lower (and upper) bounds of the minimum eigenvalue of the Neural Tangent Kernel (NTK) under the (in)finite-width regime.
We show how the derived results can guide NAS to select top-performing architectures, even without any training; a minimal score of this kind is sketched after this entry.
arXiv Detail & Related papers (2022-09-15T12:11:41Z)
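As a hedged illustration of the quantity this entry discusses (a generic estimator, not necessarily the one used in the paper), the minimum eigenvalue of the empirical NTK Gram matrix can be computed for an untrained network on a small batch; the scalar-output model and tiny batch below are simplifications.

```python
# Hedged sketch: score an untrained network by the minimum eigenvalue of its
# empirical NTK on a small batch. Not necessarily the paper's estimator.
import torch
import torch.nn as nn

def ntk_min_eigenvalue(model: nn.Module, x: torch.Tensor) -> float:
    grads = []
    for i in range(x.shape[0]):
        out = model(x[i:i + 1]).sum()  # scalar output for sample i
        g = torch.autograd.grad(out, tuple(model.parameters()))
        grads.append(torch.cat([t.reshape(-1) for t in g]))
    J = torch.stack(grads)   # per-sample gradients, shape (batch, n_params)
    K = J @ J.t()            # empirical NTK Gram: K[i, j] = <grad_i, grad_j>
    return torch.linalg.eigvalsh(K).min().item()

# Rank candidate architectures at initialization by this score (no training).
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
score = ntk_min_eigenvalue(model, torch.randn(8, 16))
```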
- A Survey on Surrogate-assisted Efficient Neural Architecture Search [18.914781707473296]
Neural architecture search (NAS) has recently become increasingly popular in the deep learning community.
NAS remains laborious and time-consuming because a large number of performance estimations are required during the search process.
Improving the efficiency of these estimations is therefore essential to the design of practical NAS methods; a toy surrogate-assisted loop is sketched after this entry.
arXiv Detail & Related papers (2022-06-03T12:02:20Z)
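The following minimal loop illustrates the surrogate-assisted pattern the survey covers: a cheap regressor fitted on a few true evaluations pre-screens a large candidate pool, so only promising architectures receive expensive training. `encode`, `train_and_eval`, and `sample_arch` are hypothetical user-supplied callables, and the random-forest surrogate is just one common choice among many.

```python
# Toy sketch of surrogate-assisted NAS; all callables passed in are
# hypothetical stand-ins for the user's search space and training pipeline.
from sklearn.ensemble import RandomForestRegressor

def surrogate_assisted_search(encode, train_and_eval, sample_arch,
                              n_init=20, n_rounds=10, pool=200, top_k=5):
    archs = [sample_arch() for _ in range(n_init)]
    accs = [train_and_eval(a) for a in archs]          # expensive true evals
    for _ in range(n_rounds):
        surrogate = RandomForestRegressor().fit([encode(a) for a in archs], accs)
        candidates = [sample_arch() for _ in range(pool)]
        preds = surrogate.predict([encode(a) for a in candidates])
        best = sorted(zip(preds, candidates), key=lambda t: -t[0])[:top_k]
        for _, a in best:                              # evaluate only top picks
            archs.append(a)
            accs.append(train_and_eval(a))
    return max(zip(accs, archs), key=lambda t: t[0])   # (best_acc, best_arch)
```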
- NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy [37.72015163462501]
We present an in-depth analysis of popular NAS algorithms and performance prediction methods across 25 different combinations of search spaces and datasets.
We introduce NAS-Bench-Suite, a comprehensive collection of NAS benchmarks accessible through a unified interface.
arXiv Detail & Related papers (2022-01-31T18:02:09Z)
- NAS-Bench-360: Benchmarking Diverse Tasks for Neural Architecture Search [18.9676056830197]
Most existing neural architecture search (NAS) benchmarks and algorithms prioritize performance on well-studied tasks.
We present NAS-Bench-360, a benchmark suite for evaluating state-of-the-art NAS methods for convolutional neural networks (CNNs).
arXiv Detail & Related papers (2021-10-12T01:13:18Z)
- TransNAS-Bench-101: Improving Transferability and Generalizability of Cross-Task Neural Architecture Search [98.22779489340869]
We propose TransNAS-Bench-101, a benchmark dataset containing network performance across seven vision tasks.
We explore two fundamentally different types of search space: cell-level search space and macro-level search space.
With 7,352 backbones evaluated on seven tasks, 51,464 trained models with detailed training information are provided.
arXiv Detail & Related papers (2021-05-25T12:15:21Z)
- AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment [23.988393741948485]
We propose a novel search strategy for one-shot and sparse propagation NAS, namely AdvantageNAS.
AdvantageNAS is a gradient-based approach that improves the search efficiency by introducing credit assignment in gradient estimation for architecture updates.
Experiments on NAS-Bench-201 and the PTB dataset show that AdvantageNAS discovers architectures with higher performance under a limited time budget; a loose illustration of the credit-assignment idea follows this entry.
arXiv Detail & Related papers (2020-12-11T05:45:03Z)
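The summary above gives only the idea; as a deliberately loose sketch (not the AdvantageNAS estimator itself), here is a REINFORCE-style architecture update where an advantage term, reward minus a running baseline, assigns credit to the sampled operation choices. The edge/operation counts and the toy reward are made up for illustration.

```python
# Loose illustration of advantage-based credit assignment for architecture
# updates. This is NOT the AdvantageNAS algorithm, only the baseline idea.
import torch

logits = torch.zeros(6, 5, requires_grad=True)  # 6 edges, 5 candidate ops each
opt = torch.optim.Adam([logits], lr=0.05)
baseline = 0.0

def reward_of(sample):
    # Toy stand-in for the validation accuracy of the sampled architecture.
    return -float((sample - 2).abs().sum())

for _ in range(200):
    dist = torch.distributions.Categorical(logits=logits)
    sample = dist.sample()                 # one operation choice per edge
    r = reward_of(sample)
    advantage = r - baseline               # credit relative to the baseline
    baseline = 0.9 * baseline + 0.1 * r    # running-average baseline
    loss = -advantage * dist.log_prob(sample).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
```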
- NAS-FAS: Static-Dynamic Central Difference Network Search for Face Anti-Spoofing [94.89405915373857]
Face anti-spoofing (FAS) plays a vital role in securing face recognition systems.
Existing methods rely on expert-designed networks, which may lead to sub-optimal solutions for the FAS task.
Here we propose the first NAS-based FAS method, called NAS-FAS, to discover well-suited task-aware networks.
arXiv Detail & Related papers (2020-11-03T23:34:40Z)
- NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search [55.12928953187342]
We propose an extension to NAS-Bench-101: NAS-Bench-201 with a different search space, results on multiple datasets, and more diagnostic information.
NAS-Bench-201 has a fixed search space and provides a unified benchmark for almost any up-to-date NAS algorithm.
We provide additional diagnostic information, such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms; a query sketch follows this entry.
arXiv Detail & Related papers (2020-01-02T05:28:26Z)
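Querying the benchmark looks roughly like the following, assuming the authors' `nas_201_api` package. The method names follow its README, but the checkpoint filename and exact signatures may differ between releases, so treat this as a sketch rather than a definitive usage guide.

```python
# Hedged sketch of querying NAS-Bench-201 via the nas_201_api package
# (pip install nas-bench-201). Filenames and signatures may vary by release.
from nas_201_api import NASBench201API

api = NASBench201API('NAS-Bench-201-v1_1-096897.pth')  # pre-downloaded file
print(len(api))  # number of architectures in the fixed search space

# Logged results for architecture #123 on CIFAR-10 (200-epoch schedule),
# retrieved without training anything.
info = api.get_more_info(123, 'cifar10', hp='200')
print(info['test-accuracy'])
```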