Going Beyond Neural Architecture Search with Sampling-based Neural
Ensemble Search
- URL: http://arxiv.org/abs/2109.02533v1
- Date: Mon, 6 Sep 2021 15:18:37 GMT
- Title: Going Beyond Neural Architecture Search with Sampling-based Neural
Ensemble Search
- Authors: Yao Shu, Yizhou Chen, Zhongxiang Dai, Bryan Kian Hsiang Low
- Abstract summary: We present two novel sampling algorithms under our Neural Ensemble Search via Sampling (NESS) framework.
Our NESS algorithms are shown to achieve improved performance in both classification and adversarial defense tasks.
- Score: 31.059040393415003
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, Neural Architecture Search (NAS) has been widely applied to
automate the design of deep neural networks. Various NAS algorithms have been
proposed to reduce the search cost and improve the generalization performance
of the final selected architectures. However, these NAS algorithms aim to
select only a single neural architecture from the search space and thus
overlook the ability of other candidate architectures to help improve the
performance of the final selected architecture. To this end, we present two
novel sampling algorithms under our Neural Ensemble Search via Sampling (NESS)
framework that can effectively and efficiently select a well-performing
ensemble of neural architectures from a NAS search space. Compared with
state-of-the-art NAS algorithms and other well-known ensemble search
baselines, our NESS algorithms are shown to achieve improved performance on
both classification and adversarial defense tasks across various benchmark
datasets while incurring a search cost comparable to that of these NAS
algorithms.
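This summary does not spell out the paper's two sampling algorithms, so the
sketch below illustrates only the general idea of selecting an ensemble from a
pool of candidate architectures: greedy forward selection on validation
predictions, a common ensemble-search baseline rather than NESS itself. All
data and candidates here are hypothetical stand-ins.

```python
# Greedy forward ensemble selection over a pool of candidate architectures.
# Minimal sketch: each "architecture" is represented only by randomly
# generated validation logits; in a real NAS setting these would come from
# networks sampled from the search space and trained.
import numpy as np

rng = np.random.default_rng(0)
n_candidates, n_val, n_classes = 20, 500, 10

val_labels = rng.integers(0, n_classes, size=n_val)
pool_logits = rng.normal(size=(n_candidates, n_val, n_classes))
for i in range(n_candidates):
    # Bias each candidate toward the true labels so their accuracies differ.
    pool_logits[i, np.arange(n_val), val_labels] += rng.uniform(0.5, 2.0)

def ensemble_accuracy(member_ids):
    """Validation accuracy of the average-logit ensemble."""
    avg = pool_logits[list(member_ids)].mean(axis=0)
    return (avg.argmax(axis=1) == val_labels).mean()

ensemble, ensemble_size = [], 5
for _ in range(ensemble_size):
    # Add whichever remaining candidate improves validation accuracy most.
    best = max(
        (i for i in range(n_candidates) if i not in ensemble),
        key=lambda i: ensemble_accuracy(ensemble + [i]),
    )
    ensemble.append(best)
    print(f"members={ensemble} val_acc={ensemble_accuracy(ensemble):.3f}")
```

In a real run the selection metric could be any validation objective, e.g.
robust accuracy under attack for the adversarial defense setting.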
Related papers
- GeNAS: Neural Architecture Search with Better Generalization [14.92869716323226]
Recent neural architecture search (NAS) approaches rely on validation loss or accuracy to find a superior network for the target data.
In this paper, we investigate a new neural architecture search measure for discovering architectures with better generalization.
arXiv Detail & Related papers (2023-05-15T12:44:54Z)
- HiveNAS: Neural Architecture Search using Artificial Bee Colony Optimization [0.0]
In this study, we evaluate the viability of Artificial Bee Colony optimization for Neural Architecture Search.
Our proposed framework, HiveNAS, outperforms existing state-of-the-art Swarm Intelligence-based NAS frameworks in a fraction of the time.
arXiv Detail & Related papers (2022-11-18T14:11:47Z)
- NASI: Label- and Data-agnostic Neural Architecture Search at Initialization [35.18069719489172]
We propose a novel NAS algorithm called NAS at Initialization (NASI).
NASI exploits the ability of the Neural Tangent Kernel to characterize the converged performance of candidate architectures (a toy NTK-trace scorer is sketched after this list).
NASI also achieves competitive search effectiveness on various datasets such as CIFAR-10/100 and ImageNet.
arXiv Detail & Related papers (2021-09-02T09:49:28Z)
- Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, Search to Aggregate NEighborhood (SANE), that automatically designs data-specific GNN architectures.
On a novel and expressive search space, we propose a differentiable search algorithm that is more efficient than previous reinforcement-learning-based methods (see the mixed-operation sketch after this list).
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
- Trilevel Neural Architecture Search for Efficient Single Image Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
To model the discrete search space, we apply a new continuous relaxation that builds a hierarchical mixture of network paths, cell operations, and kernel widths.
An efficient search algorithm is proposed to perform optimization in a hierarchical supernet manner.
arXiv Detail & Related papers (2021-01-17T12:19:49Z)
- Effective, Efficient and Robust Neural Architecture Search [4.273005643715522]
Recent advances in adversarial attacks show the vulnerability of deep neural networks searched by Neural Architecture Search (NAS).
We propose an Effective, Efficient, and Robust Neural Architecture Search (E2RNAS) method that searches for a neural network architecture by taking performance, robustness, and resource constraints into consideration.
Experiments on benchmark datasets show that the proposed E2RNAS method can find adversarially robust architectures with optimized model size and comparable classification accuracy.
arXiv Detail & Related papers (2020-11-19T13:46:23Z)
- Neural Architecture Search of SPD Manifold Networks [79.45110063435617]
We propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
arXiv Detail & Related papers (2020-10-27T18:08:57Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)
- DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
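Two of the methods above lend themselves to miniature illustrations. First, for NASI: the sketch below is not NASI's actual label- and data-agnostic algorithm, only a toy score in the same NTK-at-initialization spirit. It estimates the trace of the empirical Neural Tangent Kernel of an untrained network, trace(NTK) = sum over inputs x of ||grad_theta f(x)||^2, using PyTorch on random probe inputs; the tiny MLP and batch size are illustrative choices.

```python
# Toy NTK-trace score at initialization (in the spirit of NASI, not its
# exact algorithm). For a scalar-output network f, NTK(x, x) equals the
# squared norm of the parameter gradient of f(x), so the trace over a
# batch is the sum of those squared norms.
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 1))
probes = torch.randn(8, 32)  # random inputs; no labels or real data needed

def ntk_trace(model, inputs):
    trace = 0.0
    for x in inputs:
        model.zero_grad()
        out = model(x.unsqueeze(0)).squeeze()
        out.backward()  # gradient of the scalar output w.r.t. all parameters
        trace += sum((p.grad ** 2).sum().item() for p in model.parameters())
    return trace

print(f"NTK trace at initialization: {ntk_trace(net, probes):.2f}")
```

Second, both SANE and the trilevel super-resolution search build on a continuous relaxation of a discrete search space. The generic DARTS-style mixed operation below shows the core trick: a softmax over architecture logits turns a discrete choice among operations into a differentiable mixture. The three candidate operations are arbitrary placeholders, and neither paper's actual search space is reproduced here.

```python
# Generic continuous relaxation of a discrete operation choice (the
# DARTS-style "mixed op" that differentiable NAS methods build on).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),  # placeholder op 1
            nn.Conv2d(channels, channels, 5, padding=2),  # placeholder op 2
            nn.Identity(),                                # placeholder op 3
        ])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        # Softmax over the logits yields a differentiable mixture of ops,
        # so the architecture choice can be optimized by gradient descent.
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

op = MixedOp(channels=8)
out = op(torch.randn(2, 8, 16, 16))
print(out.shape)  # torch.Size([2, 8, 16, 16])
```

During search, the logits alpha are updated by gradient descent alongside (or alternating with) the network weights; after search, the operation with the largest logit is typically retained.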