Neural Network Design: Learning from Neural Architecture Search
- URL: http://arxiv.org/abs/2011.00521v1
- Date: Sun, 1 Nov 2020 15:02:02 GMT
- Title: Neural Network Design: Learning from Neural Architecture Search
- Authors: Bas van Stein, Hao Wang, and Thomas Bäck
- Abstract summary: Neural Architecture Search (NAS) aims to optimize the architecture of deep neural networks for better accuracy or lower computational cost.
Despite the various successful approaches proposed for the NAS task, its landscape and properties are rarely investigated.
- Score: 3.9430294028981763
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS) aims to optimize the architecture of deep
neural networks for better accuracy or lower computational cost, and has
recently attracted growing research interest. Despite the various successful
approaches proposed to solve the NAS task, its landscape and properties are
rarely investigated. In this paper, we argue for the necessity of studying the
landscape properties of NAS and propose to use the so-called Exploratory
Landscape Analysis (ELA) techniques for this goal. Taking a broad set of deep
convolutional network designs, we conduct extensive experiments to obtain
their performance. Based on our analysis of the experimental results, we
observe high similarities between well-performing architecture designs, which
we then use to significantly narrow the search space and thereby improve the
efficiency of any NAS algorithm. Moreover, we extract ELA features of the NAS
landscapes on three common image classification data sets, MNIST,
Fashion-MNIST, and CIFAR-10, and show that the NAS landscapes of these three
data sets can be distinguished from one another. Also, when compared to the
ELA features of the well-known Black-Box Optimization Benchmarking (BBOB)
problem set, the NAS landscapes surprisingly form a new problem class of their
own, which can be separated from all 24 BBOB problems. Given this interesting
observation, we therefore stress the importance of further investigating the
selection of an efficient optimizer for the NAS landscape, as well as the
necessity of augmenting the current benchmark problem set.
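As a rough illustration of the ELA workflow described above, the sketch below samples a toy search space, evaluates each configuration, and computes two simple ELA feature groups: y-distribution statistics and a linear meta-model fit. The `evaluate_architecture` stand-in and the continuous encoding are assumptions for illustration; the paper trains actual convolutional networks on image data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def evaluate_architecture(x):
    """Hypothetical stand-in for training the CNN encoded by x and
    returning its validation error (the paper trains real networks)."""
    return float(np.sum((x - 0.3) ** 2) + 0.05 * rng.standard_normal())

# Sample a (here: 5-dimensional, unit-cube-encoded) design space.
n_samples, dim = 200, 5
X = rng.uniform(0.0, 1.0, size=(n_samples, dim))
y = np.array([evaluate_architecture(x) for x in X])

# ELA y-distribution features: skewness and kurtosis of the objective values.
ela_distr = {"skewness": stats.skew(y), "kurtosis": stats.kurtosis(y)}

# ELA meta-model feature: adjusted R^2 of a linear model fitted to the samples.
A = np.column_stack([np.ones(n_samples), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ coef
r2 = 1.0 - residuals.var() / y.var()
adj_r2 = 1.0 - (1.0 - r2) * (n_samples - 1) / (n_samples - dim - 1)
ela_meta = {"lin_model_adj_r2": adj_r2}

print(ela_distr, ela_meta)
```

Feature vectors like these, computed per data set, are the kind of landscape fingerprints that allow the MNIST, Fashion-MNIST, and CIFAR-10 NAS landscapes to be told apart and compared against the BBOB problems.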
Related papers
- FlatNAS: optimizing Flatness in Neural Architecture Search for Out-of-Distribution Robustness [3.724847012963521]
This study introduces a novel NAS solution called Flat Neural Architecture Search (FlatNAS).
It explores the interplay between a novel figure of merit based on robustness to weight perturbations and single-NN optimization with Sharpness-Aware Minimization (SAM).
The OOD robustness of the NAS-designed models is evaluated by focusing on robustness to input data corruptions, using popular benchmark datasets from the literature.
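As a minimal sketch of what a flatness-based figure of merit can look like, the following estimates SAM-style sharpness by perturbing the weights inside an L2 ball and recording the worst observed loss increase. The toy quadratic `loss`, the radius `rho`, and the trial count are illustrative assumptions, not FlatNAS's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

def loss(w):
    # Toy stand-in for a trained network's loss as a function of its weights.
    return float(np.sum(w ** 2))

def sharpness(w, rho=0.05, n_trials=64):
    """SAM-style flatness proxy: worst loss increase when the weights
    are perturbed within an L2 ball of radius rho."""
    base = loss(w)
    worst = 0.0
    for _ in range(n_trials):
        eps = rng.standard_normal(w.shape)
        eps *= rho / np.linalg.norm(eps)  # project onto the sphere of radius rho
        worst = max(worst, loss(w + eps) - base)
    return worst

w = rng.standard_normal(10)
print(f"sharpness estimate: {sharpness(w):.4f}")  # lower = flatter minimum
```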
arXiv Detail & Related papers (2024-02-29T12:33:14Z)
- DONNAv2 -- Lightweight Neural Architecture Search for Vision tasks [6.628409795264665]
We present the next-generation neural architecture design for computationally efficient neural architecture distillation: DONNAv2.
DONNAv2 reduces the computational cost of DONNA by 10x for the larger datasets.
To improve the quality of the NAS search space, DONNAv2 leverages a block knowledge distillation filter to remove blocks with high inference cost.
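The filtering step in the blurb can be pictured as a cost cutoff over candidate blocks, as in this hedged sketch; the block names, latencies, and budget are invented, and DONNAv2's actual filter is driven by block knowledge distillation signals rather than a plain threshold.

```python
# Hypothetical candidate blocks with a measured inference cost (ms) and a
# distillation-based quality proxy; names and numbers are illustrative only.
candidates = [
    {"name": "conv3x3_e3", "latency_ms": 1.2, "quality": 0.91},
    {"name": "conv5x5_e6", "latency_ms": 4.8, "quality": 0.93},
    {"name": "mbconv_e1",  "latency_ms": 0.7, "quality": 0.88},
]

LATENCY_BUDGET_MS = 2.0  # assumed per-block budget

# Keep only blocks whose inference cost fits the budget, shrinking the
# search space before the (more expensive) architecture search runs.
search_space = [b for b in candidates if b["latency_ms"] <= LATENCY_BUDGET_MS]
print([b["name"] for b in search_space])
```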
arXiv Detail & Related papers (2023-09-26T04:48:50Z)
- UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of ground-truth labels in NAS and inspired broad interest.
We take a further step and question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z)
- Poisoning the Search Space in Neural Architecture Search [0.0]
We evaluate the robustness of one such algorithm, Efficient NAS, against data poisoning attacks on the original search space.
Our results provide insights into the challenges that must be surmounted when using NAS for more adversarially robust architecture search.
arXiv Detail & Related papers (2021-06-28T05:45:57Z)
- Searching Efficient Model-guided Deep Network for Image Denoising [61.65776576769698]
We present a novel approach that connects model-guided design with NAS (MoD-NAS).
MoD-NAS employs a highly reusable width search strategy and a densely connected search block to automatically select the operations of each layer.
Experimental results on several popular datasets show that MoD-NAS achieves even better PSNR performance than current state-of-the-art methods.
arXiv Detail & Related papers (2021-04-06T14:03:01Z)
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search [100.28980854978768]
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present the HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
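A toy sketch of the block-wise factorization: candidates are rated per block, and the final architecture takes the per-block best. The candidate names and ratings are invented; BossNAS derives its ratings from self-supervised ensemble bootstrapping rather than the fixed numbers used here.

```python
# Hypothetical per-block candidate ratings (higher is better); in BossNAS
# each block is trained and rated separately, then the best candidate of
# every block is assembled into the final network.
block_candidates = {
    "block1": {"res_conv": 0.71, "hytra_trans": 0.69},
    "block2": {"res_conv": 0.63, "hytra_trans": 0.68},
    "block3": {"res_conv": 0.75, "hytra_trans": 0.77},
}

architecture = {
    block: max(cands, key=cands.get)  # argmax over candidate ratings
    for block, cands in block_candidates.items()
}
print(architecture)
```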
arXiv Detail & Related papers (2021-03-23T10:05:58Z)
- Effective, Efficient and Robust Neural Architecture Search [4.273005643715522]
Recent advances in adversarial attacks show the vulnerability of deep neural networks searched by Neural Architecture Search (NAS).
We propose an Effective, Efficient, and Robust Neural Architecture Search (E2RNAS) method that searches for a neural network architecture by taking performance, robustness, and resource constraints into consideration.
Experiments on benchmark datasets show that the proposed E2RNAS method can find adversarially robust architectures with optimized model size and comparable classification accuracy.
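One simple way to picture the three-way trade-off named above is a weighted scalarization of clean accuracy, adversarial accuracy, and model size, as sketched below; the weights and candidate metrics are illustrative assumptions, not E2RNAS's actual multi-objective formulation.

```python
# Hedged sketch: combine the three objectives into one scalar score.
def score(acc, robust_acc, params_m, w=(1.0, 0.5, 0.01)):
    """Higher is better: reward clean and adversarial accuracy,
    penalize parameter count (in millions)."""
    return w[0] * acc + w[1] * robust_acc - w[2] * params_m

# Invented candidate architectures: (clean acc, adversarial acc, params in M).
candidates = {
    "arch_a": (0.94, 0.61, 3.2),
    "arch_b": (0.95, 0.48, 5.9),
}
best = max(candidates, key=lambda k: score(*candidates[k]))
print(best)  # arch_a wins here: slightly lower accuracy, far more robust
```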
arXiv Detail & Related papers (2020-11-19T13:46:23Z)
- Hierarchical Neural Architecture Search for Deep Stereo Matching [131.94481111956853]
We propose the first end-to-end hierarchical NAS framework for deep stereo matching.
Our framework incorporates task-specific human knowledge into the neural architecture search framework.
It ranks first in accuracy on the KITTI Stereo 2012 and 2015 benchmarks and the Middlebury benchmark, as well as on the SceneFlow dataset.
arXiv Detail & Related papers (2020-10-26T11:57:37Z)
- Binarized Neural Architecture Search for Efficient Object Recognition [120.23378346337311]
Binarized neural architecture search (BNAS) produces extremely compressed models to reduce the huge computational cost on embedded devices for edge computing.
It achieves an accuracy of 96.53% vs. 97.22% on the CIFAR-10 dataset, but with a significantly compressed model and a 40% faster search than the state-of-the-art PC-DARTS.
arXiv Detail & Related papers (2020-09-08T15:51:23Z)
- Stage-Wise Neural Architecture Search [65.03109178056937]
Modern convolutional networks such as ResNet and NASNet have achieved state-of-the-art results in many computer vision applications.
These networks consist of stages, which are sets of layers that operate on representations in the same resolution.
It has been demonstrated that increasing the number of layers in each stage improves the prediction ability of the network.
However, the resulting architecture becomes computationally expensive in terms of floating-point operations, memory requirements, and inference time.
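A back-of-the-envelope calculation makes the cost of deeper stages concrete: FLOPs for a stage of 3x3 convolutions grow linearly with the layer count at fixed resolution and width. The resolution and channel numbers below are arbitrary assumptions chosen only for illustration.

```python
# FLOPs for one 3x3 convolution layer: 2 multiply-accumulates per weight
# application, times kernel size, input/output channels, and output pixels.
def conv3x3_flops(h, w, c_in, c_out):
    return 2 * 3 * 3 * c_in * c_out * h * w

h = w = 32        # assumed feature-map resolution for this stage
channels = 64     # assumed channel width

for n_layers in (2, 4, 8):
    total = n_layers * conv3x3_flops(h, w, channels, channels)
    print(f"{n_layers} layers: {total / 1e9:.2f} GFLOPs")
```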
arXiv Detail & Related papers (2020-04-23T14:16:39Z)
- DCNAS: Densely Connected Neural Architecture Search for Semantic Image Segmentation [44.46852065566759]
We propose a Densely Connected NAS (DCNAS) framework, which directly searches for optimal network structures for the multi-scale representations of visual information.
Specifically, by connecting cells with each other using learnable weights, we introduce a densely connected search space to cover an abundance of mainstream network designs.
We demonstrate that the architecture obtained from our DCNAS algorithm achieves state-of-the-art performance on public semantic image segmentation benchmarks.
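The learnable cell connections can be sketched as a softmax-weighted combination of predecessor cell outputs, as below; the shapes, logits, and three-cell setup are illustrative assumptions rather than DCNAS's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)

def softmax(z):
    z = z - z.max()  # numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical feature maps from three previously computed cells,
# resampled to a common shape; values are illustrative.
cell_outputs = [rng.standard_normal((8, 8)) for _ in range(3)]

# Learnable connection logits, one per incoming cell; during search these
# would be optimized jointly with the network weights.
logits = np.array([0.2, -0.1, 0.5])

# Densely connected aggregation: the current cell consumes a convex
# combination of all predecessor outputs.
weights = softmax(logits)
cell_input = sum(w * o for w, o in zip(weights, cell_outputs))
print(weights, cell_input.shape)
```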
arXiv Detail & Related papers (2020-03-26T13:21:33Z)