Lessons from the Clustering Analysis of a Search Space: A Centroid-based Approach to Initializing NAS
- URL: http://arxiv.org/abs/2108.09126v1
- Date: Fri, 20 Aug 2021 11:46:33 GMT
- Title: Lessons from the Clustering Analysis of a Search Space: A Centroid-based Approach to Initializing NAS
- Authors: Kalifou Rene Traore, Andrés Camero, Xiao Xiang Zhu
- Abstract summary: The recent availability of NAS benchmarks has enabled prototyping with low computational resources.
First, a calibrated clustering analysis of the search space is performed.
Second, the centroids are extracted and used to initialize a NAS algorithm.
- Score: 12.901952926144258
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Much effort in neural architecture search (NAS) research has been
dedicated to algorithmic development, aiming at designing more efficient and
less costly methods. Nonetheless, the investigation of how these techniques
are initialized remains scarce, and most NAS methodologies currently rely on
stochastic initialization procedures, because acquiring information prior to
the search is costly. However, the recent availability of NAS benchmarks has
enabled prototyping with low computational resources. In this study, we propose to
accelerate a NAS algorithm using a data-driven initialization technique,
leveraging the availability of NAS benchmarks. In particular, we propose a
two-step methodology. First, a calibrated clustering analysis of the search
space is performed. Second, the centroids are extracted and used to initialize
a NAS algorithm. We tested our proposal using Aging Evolution, an evolutionary
algorithm, on NAS-Bench-101. The results show that, compared to a random
initialization, the proposed approach converges faster and reaches a better
final solution.
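To make the two-step methodology concrete, below is a minimal sketch of a centroid-based initialization feeding Aging Evolution. It is an illustration under stated assumptions, not the paper's exact implementation: `encode` and `query_accuracy` are hypothetical stand-ins for a NAS-Bench-101 cell encoder and benchmark lookup, `mutate` is a user-supplied mutation operator, and plain k-means stands in for the paper's calibrated clustering analysis.

```python
import collections
import random
import numpy as np
from sklearn.cluster import KMeans

def centroid_based_population(archs, encode, pop_size, seed=0):
    """Steps 1-2: cluster the encoded search space, then return one
    architecture per cluster (the member closest to its centroid)."""
    X = np.stack([encode(a) for a in archs])  # hypothetical cell encoder
    km = KMeans(n_clusters=pop_size, random_state=seed).fit(X)
    population = []
    for c in range(pop_size):
        members = np.where(km.labels_ == c)[0]
        dists = np.linalg.norm(X[members] - km.cluster_centers_[c], axis=1)
        population.append(archs[members[np.argmin(dists)]])
    return population

def aging_evolution(init_pop, query_accuracy, mutate, cycles, sample_size):
    """Aging Evolution (regularized evolution), seeded with the
    centroid-based population instead of random architectures."""
    population = collections.deque(init_pop)
    best = max(init_pop, key=query_accuracy)  # hypothetical benchmark lookup
    for _ in range(cycles):
        sample = random.sample(list(population), sample_size)
        parent = max(sample, key=query_accuracy)
        child = mutate(parent)
        population.append(child)   # youngest in ...
        population.popleft()       # ... oldest out ("aging")
        best = max(best, child, key=query_accuracy)
    return best
```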
Related papers
- TopoNAS: Boosting Search Efficiency of Gradient-based NAS via Topological Simplification [11.08910129925713]
TopoNAS is a model-agnostic approach for gradient-based one-shot NAS.
It significantly reduces searching time and memory usage by topological simplification of searchable paths.
arXiv Detail & Related papers (2024-08-02T15:01:29Z)
- Meta-prediction Model for Distillation-Aware NAS on Unseen Datasets [55.2118691522524]
Distillation-aware Neural Architecture Search (DaNAS) aims to search for an optimal student architecture.
We propose a distillation-aware meta accuracy prediction model, DaSS (Distillation-aware Student Search), which can predict a given architecture's final performance on a dataset.
arXiv Detail & Related papers (2023-05-26T14:00:35Z)
- Speeding up NAS with Adaptive Subset Selection [21.31075249079979]
We present an adaptive subset selection approach to neural architecture search (NAS).
We devise an algorithm that makes use of state-of-the-art techniques from both areas, subset selection and NAS.
Our results are consistent across multiple datasets.
arXiv Detail & Related papers (2022-11-02T19:48:42Z)
- A Survey on Surrogate-assisted Efficient Neural Architecture Search [18.914781707473296]
Neural architecture search (NAS) has become increasingly popular in the deep learning community recently.
NAS is still laborious and time-consuming because a large number of performance estimations are required during the search.
Reducing the cost of these estimations, e.g., with surrogate models, is therefore essential to making NAS practical (a minimal surrogate sketch follows this list).
arXiv Detail & Related papers (2022-06-03T12:02:20Z)
- U-Boost NAS: Utilization-Boosted Differentiable Neural Architecture Search [50.33956216274694]
Optimizing resource utilization in target platforms is key to achieving high performance during DNN inference.
We propose a novel hardware-aware NAS framework that optimizes not only for task accuracy and inference latency, but also for resource utilization.
We achieve a 2.8-4x speedup for DNN inference compared to prior hardware-aware NAS methods.
arXiv Detail & Related papers (2022-03-23T13:44:15Z)
- A Data-driven Approach to Neural Architecture Search Initialization [12.901952926144258]
We propose a data-driven technique to initialize a population-based NAS algorithm.
We benchmark our proposed approach against random and Latin hypercube sampling (a minimal sketch of these sampling baselines follows this list).
arXiv Detail & Related papers (2021-11-05T14:30:19Z)
- Trilevel Neural Architecture Search for Efficient Single Image Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
To model the discrete search space, we apply a new continuous relaxation on the discrete search spaces to build a hierarchical mixture of network-path, cell-operations, and kernel-width (a sketch of this kind of relaxation follows this list).
An efficient search algorithm is proposed to perform optimization in a hierarchical supernet manner.
arXiv Detail & Related papers (2021-01-17T12:19:49Z)
- Hyperparameter Optimization in Neural Networks via Structured Sparse Recovery [54.60327265077322]
We study two important problems in the automated design of neural networks through the lens of sparse recovery methods.
In the first part of this paper, we establish a novel connection between HPO and structured sparse recovery.
In the second part of this paper, we establish a connection between NAS and structured sparse recovery.
arXiv Detail & Related papers (2020-07-07T00:57:09Z)
- DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search [76.9225014200746]
Efficient search is a core issue in Neural Architecture Search (NAS).
We present DA-NAS that can directly search the architecture for large-scale target tasks while allowing a large candidate set in a more efficient manner.
It is 2x faster than previous methods while the accuracy is currently state-of-the-art, at 76.2% under a small FLOPs constraint.
arXiv Detail & Related papers (2020-03-27T17:55:21Z)
- NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search [55.12928953187342]
We propose an extension to NAS-Bench-101: NAS-Bench-201 with a different search space, results on multiple datasets, and more diagnostic information.
NAS-Bench-201 has a fixed search space and provides a unified benchmark for almost any up-to-date NAS algorithms.
We provide additional diagnostic information, such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms (a toy tabular-benchmark lookup sketch follows this list).
arXiv Detail & Related papers (2020-01-02T05:28:26Z)
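The surrogate-assisted survey entry above hinges on one idea: replace most of the expensive performance estimations with a cheap learned predictor. Below is a minimal sketch under assumptions of ours, not the survey's: architectures are assumed to be already encoded as fixed-length vectors, and the random forest is an arbitrary choice of regressor.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def fit_surrogate(encodings, accuracies):
    """Fit a cheap predictor on the few architectures evaluated so far."""
    model = RandomForestRegressor(n_estimators=100, random_state=0)
    model.fit(np.asarray(encodings), np.asarray(accuracies))
    return model

def shortlist(model, candidate_encodings, top_k):
    """Rank unevaluated architectures by predicted accuracy; only the
    top_k shortlisted ones receive a real, expensive evaluation."""
    preds = model.predict(np.asarray(candidate_encodings))
    return np.argsort(preds)[::-1][:top_k]
```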
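The data-driven initialization paper above benchmarks against random and Latin hypercube sampling. Here is a numpy-only sketch of those two baselines over a d-dimensional unit-cube encoding of architectures; mapping the continuous points back to valid architectures is elided.

```python
import numpy as np

def random_init(n, d, rng):
    """Baseline 1: n uniform random points in [0, 1)^d."""
    return rng.random((n, d))

def latin_hypercube_init(n, d, rng):
    """Baseline 2: Latin hypercube sampling. Each dimension is cut into
    n equal strata and every stratum is sampled exactly once."""
    offsets = rng.random((n, d))
    samples = (np.arange(n)[:, None] + offsets) / n  # stratify each column
    for j in range(d):
        samples[:, j] = samples[rng.permutation(n), j]  # decouple columns
    return samples

rng = np.random.default_rng(0)
pop = latin_hypercube_init(n=20, d=10, rng=rng)  # 20 well-spread seed points
```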
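The trilevel SR paper above relies on continuous relaxation of discrete choices. The single-choice building block, in the generic DARTS style, looks as follows; the paper applies such relaxations hierarchically to network paths, cell operations, and kernel widths, whereas this sketch shows only one level.

```python
import numpy as np

def relaxed_choice(alphas, candidate_outputs):
    """Continuous relaxation of one discrete choice: instead of selecting
    a single candidate operation, output a softmax-weighted mixture, so
    the architecture parameters `alphas` become gradient-trainable."""
    w = np.exp(alphas - np.max(alphas))  # numerically stable softmax
    w = w / w.sum()
    return sum(wi * out for wi, out in zip(w, candidate_outputs))
```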
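Finally, what lets benchmarks like NAS-Bench-101/201 enable low-resource prototyping is that every performance estimation collapses into a table lookup. A toy, library-free stand-in; the real benchmarks ship their own query APIs, and the table contents below are invented.

```python
# hypothetical table: architecture key -> (validation accuracy, train seconds)
table = {
    "conv3x3|maxpool": (0.91, 840.0),
    "conv1x1|skip":    (0.88, 512.0),
}

elapsed, best_key, best_acc = 0.0, None, -1.0
for key, (acc, train_time) in table.items():
    elapsed += train_time  # simulated cost a real training run would pay
    if acc > best_acc:
        best_key, best_acc = key, acc
print(best_key, best_acc, elapsed)  # search "cost" is lookups, not training
```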