UnrealNAS: Can We Search Neural Architectures with Unreal Data?
- URL: http://arxiv.org/abs/2205.02162v1
- Date: Wed, 4 May 2022 16:30:26 GMT
- Title: UnrealNAS: Can We Search Neural Architectures with Unreal Data?
- Authors: Zhen Dong, Kaicheng Zhou, Guohao Li, Qiang Zhou, Mingfei Guo, Bernard
Ghanem, Kurt Keutzer, and Shanghang Zhang
- Abstract summary: Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
- Score: 84.78460976605425
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural architecture search (NAS) has shown great success in the automatic
design of deep neural networks (DNNs). However, the best way to use data to
search network architectures is still unclear and under exploration. Previous
work [19, 46] has analyzed the necessity of having ground-truth labels in NAS
and inspired broad interest. In this work, we take a further step to question
whether real data is necessary for NAS to be effective. The answer to this
question is important for applications with a limited amount of accessible
data, and can help people improve NAS by leveraging the extra flexibility of
data generation. To explore whether NAS needs real data, we construct three
types of unreal datasets using: 1) randomly labeled real images; 2) generated
images and labels; and 3) generated Gaussian noise with random labels. These
datasets facilitate analysis of the generalization and expressivity of the searched
architectures. We study the performance of architectures searched on these
constructed datasets using popular differentiable NAS methods. Extensive
experiments on CIFAR, ImageNet and CheXpert [12] show that the searched
architectures can achieve promising results compared with those derived from
the conventional NAS pipeline with real labeled data, suggesting the
feasibility of performing NAS with unreal data.
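To make the dataset construction concrete, below is a minimal PyTorch sketch of how variants 1) and 3) could be built; the class names RandomlyLabeledCIFAR10 and GaussianNoiseDataset are illustrative, not from the paper, and variant 2) would additionally require a pretrained image generator, which is omitted here.
```python
import torch
from torch.utils.data import Dataset
import torchvision
import torchvision.transforms as T

class RandomlyLabeledCIFAR10(Dataset):
    """Variant 1): real CIFAR-10 images paired with uniformly random labels."""
    def __init__(self, root="./data", num_classes=10, seed=0):
        self.base = torchvision.datasets.CIFAR10(
            root, train=True, download=True, transform=T.ToTensor())
        g = torch.Generator().manual_seed(seed)
        # Fixed random relabeling, drawn once so training targets are stable.
        self.labels = torch.randint(num_classes, (len(self.base),), generator=g)

    def __len__(self):
        return len(self.base)

    def __getitem__(self, i):
        image, _ = self.base[i]            # ground-truth label is discarded
        return image, int(self.labels[i])

class GaussianNoiseDataset(Dataset):
    """Variant 3): pure Gaussian-noise 'images' with random labels."""
    def __init__(self, size=50000, shape=(3, 32, 32), num_classes=10, seed=0):
        g = torch.Generator().manual_seed(seed)
        self.images = torch.randn((size, *shape), generator=g)
        self.labels = torch.randint(num_classes, (size,), generator=g)

    def __len__(self):
        return self.images.shape[0]

    def __getitem__(self, i):
        return self.images[i], int(self.labels[i])
```
Either dataset can then be dropped into a differentiable NAS method such as DARTS in place of the real labeled data.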
Related papers
- Fair Differentiable Neural Network Architecture Search for Long-Tailed Data with Self-Supervised Learning [0.0]
This paper explores to improve the searching and training performance of NAS on long-tailed datasets.
We first discuss the related works about NAS and the deep learning method for long-tailed datasets.
Then, we focus on an existing work, called SSF-NAS, which integrates the self-supervised learning and fair differentiable NAS.
Finally, we conducted a series of experiments on the CIFAR10-LT dataset for performance evaluation.
arXiv Detail & Related papers (2024-06-19T12:39:02Z)
- GraphPNAS: Learning Distribution of Good Neural Architectures via Deep Graph Generative Models [48.57083463364353]
We study neural architecture search (NAS) through the lens of learning random graph models.
We propose GraphPNAS a deep graph generative model that learns a distribution of well-performing architectures.
We show that our proposed graph generator consistently outperforms an RNN-based one and achieves performance better than or comparable to state-of-the-art NAS methods.
arXiv Detail & Related papers (2022-11-28T09:09:06Z)
- Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets [42.993720854755736]
We propose an efficient Neural Architecture Search (NAS) framework that is trained once on a database consisting of datasets and pretrained networks.
We show that our model, meta-learned on subsets of ImageNet-1K and architectures from the NAS-Bench-201 search space, successfully generalizes to multiple unseen datasets.
arXiv Detail & Related papers (2021-07-02T06:33:59Z)
- Accelerating Neural Architecture Search via Proxy Data [17.86463546971522]
We propose a novel proxy data selection method tailored for neural architecture search (NAS).
Executing DARTS with the proposed selection requires only 40 minutes on CIFAR-10 and 7.5 hours on ImageNet with a single GPU.
When the architecture searched on ImageNet using the proposed selection is inversely transferred to CIFAR-10, it yields a state-of-the-art test error of 2.4%.
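The summary does not state the selection criterion; one plausible instantiation is to rank examples by the predictive entropy of a pretrained model and keep the most informative fraction. A hypothetical sketch (select_proxy_indices and keep_ratio are illustrative names, not from the paper):
```python
import torch
import torch.nn.functional as F

def select_proxy_indices(model, loader, keep_ratio=0.1, device="cpu"):
    """Rank training examples by predictive entropy under a pretrained
    model and keep the top fraction as a small proxy set for search.
    The loader must iterate without shuffling for indices to line up."""
    model.eval().to(device)
    scores = []
    with torch.no_grad():
        for images, _ in loader:
            probs = F.softmax(model(images.to(device)), dim=1)
            entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=1)
            scores.append(entropy.cpu())
    scores = torch.cat(scores)
    k = max(1, int(keep_ratio * len(scores)))
    return scores.topk(k).indices  # indices into the loader's dataset
```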
arXiv Detail & Related papers (2021-06-09T03:08:53Z)
- Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework that tries to Search to Aggregate NEighborhood (SANE) to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we propose a differentiable search algorithm, which is more efficient than previous reinforcement-learning-based methods.
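SANE's exact search space is not reproduced here; the sketch below shows the generic DARTS-style continuous relaxation over a few candidate neighborhood aggregators (MixedAggregator and the candidate set are illustrative assumptions):
```python
import torch
import torch.nn as nn

class MixedAggregator(nn.Module):
    """DARTS-style soft mixture over candidate neighborhood aggregators;
    after search, the candidate with the largest alpha is kept."""
    def __init__(self):
        super().__init__()
        self.alpha = nn.Parameter(torch.zeros(3))   # architecture parameters

    def forward(self, neigh):                       # neigh: (N, K, D) features
        w = torch.softmax(self.alpha, dim=0)
        cands = torch.stack([neigh.mean(dim=1),        # mean aggregator
                             neigh.max(dim=1).values,  # max aggregator
                             neigh.sum(dim=1)])        # sum aggregator
        return (w[:, None, None] * cands).sum(dim=0)
```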
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
- Neural Architecture Search on ImageNet in Four GPU Hours: A Theoretically Inspired Perspective [88.39981851247727]
We propose a novel framework called training-free neural architecture search (TE-NAS).
TE-NAS ranks architectures by analyzing the spectrum of the neural tangent kernel (NTK) and the number of linear regions in the input space.
We show that: (1) these two measurements imply the trainability and expressivity of a neural network; (2) they strongly correlate with the network's test accuracy.
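TE-NAS's full score also uses the NTK spectrum; below is only a minimal sketch of the second ingredient, estimating expressivity by counting distinct ReLU activation patterns on random inputs (the function name is illustrative, not from the paper):
```python
import torch
import torch.nn as nn

def count_activation_patterns(model, inputs):
    """Crude expressivity proxy: number of distinct ReLU activation
    patterns the model produces on a batch of inputs."""
    patterns, hooks = [], []

    def hook(_, __, output):
        patterns.append((output > 0).flatten(1))  # sign pattern per sample

    for m in model.modules():
        if isinstance(m, nn.ReLU):
            hooks.append(m.register_forward_hook(hook))
    with torch.no_grad():
        model(inputs)
    for h in hooks:
        h.remove()
    # Concatenate patterns across all ReLU layers, one row per sample.
    full = torch.cat(patterns, dim=1)
    return len({tuple(row.tolist()) for row in full})

# Example: score a tiny MLP on random inputs.
net = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 64),
                    nn.ReLU(), nn.Linear(64, 10))
print(count_activation_patterns(net, torch.randn(256, 8)))
```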
arXiv Detail & Related papers (2021-02-23T07:50:44Z)
- Neural Architecture Search with Random Labels [16.18010700582234]
We investigate a new variant of the neural architecture search (NAS) paradigm: searching with random labels (RLNAS).
RLNAS achieves comparable or even better results than state-of-the-art NAS methods such as PC-DARTS and Single Path One-Shot.
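As I understand it, RLNAS ranks candidates with an angle-based metric: the angle between a child network's weights at initialization and after brief training with random labels. A minimal sketch, assuming the two weight snapshots are given as lists of tensors:
```python
import torch

def angle_metric(weights_init, weights_trained):
    """Angle between flattened weight vectors at initialization and after
    training; a larger angle is taken as a better architecture score."""
    a = torch.cat([p.detach().flatten() for p in weights_init])
    b = torch.cat([p.detach().flatten() for p in weights_trained])
    cos = torch.dot(a, b) / (a.norm() * b.norm() + 1e-12)
    return torch.arccos(cos.clamp(-1.0, 1.0))
```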
arXiv Detail & Related papers (2021-01-28T06:41:48Z)
- Binarized Neural Architecture Search for Efficient Object Recognition [120.23378346337311]
Binarized neural architecture search (BNAS) produces extremely compressed models to reduce huge computational cost on embedded devices for edge computing.
An accuracy of 96.53% vs. 97.22% is achieved on the CIFAR-10 dataset, but with a significantly compressed model, and a 40% faster search than the state-of-the-art PC-DARTS.
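The summary does not detail BNAS's binarization scheme; the standard building block in binarized networks is sign binarization with a straight-through gradient estimator, sketched below as an illustration:
```python
import torch

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator: the forward
    pass uses sign(x); the backward pass lets gradients through only
    where |x| <= 1 (a hard-tanh clip)."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return grad_output * (x.abs() <= 1).to(grad_output.dtype)

binarize = BinarizeSTE.apply   # e.g. w_bin = binarize(conv.weight)
```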
arXiv Detail & Related papers (2020-09-08T15:51:23Z)
- DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search [76.9225014200746]
Efficient search is a core issue in Neural Architecture Search (NAS).
We present DA-NAS that can directly search the architecture for large-scale target tasks while allowing a large candidate set in a more efficient manner.
It is 2x faster than previous methods while the accuracy is currently state-of-the-art, at 76.2% under a small FLOPs constraint.
arXiv Detail & Related papers (2020-03-27T17:55:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.