Neural Architecture Performance Prediction Using Graph Neural Networks
- URL: http://arxiv.org/abs/2010.10024v1
- Date: Mon, 19 Oct 2020 09:33:57 GMT
- Title: Neural Architecture Performance Prediction Using Graph Neural Networks
- Authors: Jovita Lukasik, David Friede, Heiner Stuckenschmidt, Margret Keuper
- Abstract summary: We propose a surrogate model for neural architecture performance prediction built upon Graph Neural Networks (GNN).
We demonstrate the effectiveness of this surrogate model on neural architecture performance prediction for structurally unknown architectures.
- Score: 17.224223176258334
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In computer vision research, the process of automating architecture
engineering, Neural Architecture Search (NAS), has gained substantial interest.
Due to the high computational costs, most recent approaches to NAS as well as
the few available benchmarks only provide limited search spaces. In this paper
we propose a surrogate model for neural architecture performance prediction
built upon Graph Neural Networks (GNN). We demonstrate the effectiveness of
this surrogate model on neural architecture performance prediction for
structurally unknown architectures (i.e., zero-shot prediction) by evaluating
the GNN on several experiments on the NAS-Bench-101 dataset.
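The core idea of the abstract, a GNN surrogate that maps an architecture's graph encoding directly to a predicted accuracy, can be sketched minimally. The following is an illustrative sketch only, not the paper's actual model: it runs GCN-style message passing over a toy NAS-Bench-101-style cell (5 nodes, one-hot operation features) with random, untrained weights; all names, dimensions, and the readout are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gnn_predict(adj, node_feats, w1, w2, w_out):
    # Symmetrically normalise the adjacency with self-loops:
    # A_hat = D^{-1/2} (A + I) D^{-1/2}
    a = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    a_hat = a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # Two rounds of GCN-style message passing, then mean-pool and linear readout
    h = np.maximum(a_hat @ node_feats @ w1, 0.0)
    h = np.maximum(a_hat @ h @ w2, 0.0)
    graph_emb = h.mean(axis=0)
    return float(graph_emb @ w_out)  # scalar performance score (untrained here)

# Toy NAS-Bench-101-style cell: input node, three op nodes, output node
adj = np.array([[0, 1, 1, 0, 0],
                [0, 0, 0, 1, 0],
                [0, 0, 0, 1, 0],
                [0, 0, 0, 0, 1],
                [0, 0, 0, 0, 0]], dtype=float)
ops = np.eye(5)  # placeholder one-hot node (operation) features

w1 = rng.normal(size=(5, 16))
w2 = rng.normal(size=(16, 16))
w_out = rng.normal(size=16)
score = gnn_predict(adj, ops, w1, w2, w_out)
```

In the actual setting, the weights would be trained by regressing `score` against validation accuracies queried from NAS-Bench-101; here they are random, so only the data flow is demonstrated.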
Related papers
- A General-Purpose Transferable Predictor for Neural Architecture Search [22.883809911265445]
We propose a general-purpose neural predictor for Neural Architecture Search (NAS) that can transfer across search spaces.
Experimental results on NAS-Bench-101, 201 and 301 demonstrate the efficacy of our scheme.
arXiv Detail & Related papers (2023-02-21T17:28:05Z) - Network Graph Based Neural Architecture Search [57.78724765340237]
We search neural network by rewiring the corresponding graph and predict the architecture performance by graph properties.
Because we do not perform machine learning over the entire graph space, the searching process is remarkably efficient.
arXiv Detail & Related papers (2021-12-15T00:12:03Z) - Efficient Neural Architecture Search with Performance Prediction [0.0]
We use neural architecture search to find the best network architecture for the task at hand.
Existing NAS algorithms generally evaluate the fitness of a new architecture by fully training from scratch.
An end-to-end offline performance predictor is proposed to accelerate the evaluation of sampled architectures.
arXiv Detail & Related papers (2021-08-04T05:44:16Z) - Homogeneous Architecture Augmentation for Neural Predictor [13.35821898997164]
Neural Architecture Search (NAS) can automatically design well-performing architectures of Deep Neural Networks (DNNs) for the tasks at hand.
One bottleneck of NAS is the computational cost largely due to the expensive performance evaluation.
Despite their popularity, neural predictors also suffer from a severe limitation: the shortage of annotated DNN architectures for effectively training them.
arXiv Detail & Related papers (2021-07-28T03:46:33Z) - Self-Learning for Received Signal Strength Map Reconstruction with Neural Architecture Search [63.39818029362661]
We present a model based on Neural Architecture Search (NAS) and self-learning for received signal strength (RSS) map reconstruction.
The approach first finds an optimal NN architecture and simultaneously trains the deduced model on ground-truth measurements of a given RSS map.
Experimental results show that the signal predictions of this second model outperform non-learning-based state-of-the-art techniques and NN models with no architecture search.
arXiv Detail & Related papers (2021-05-17T12:19:22Z) - Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, which tries to Search to Aggregate NEighborhood (SANE) to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we propose a differentiable search algorithm, which is more efficient than previous reinforcement learning based methods.
arXiv Detail & Related papers (2021-04-14T03:15:19Z) - Weak NAS Predictors Are All You Need [91.11570424233709]
Recent predictor-based NAS approaches attempt to solve the problem with two key steps: sampling some architecture-performance pairs and fitting a proxy accuracy predictor.
We shift the paradigm from finding a complicated predictor that covers the whole architecture space to a set of weaker predictors that progressively move towards the high-performance sub-space.
Our method costs fewer samples to find the top-performance architectures on NAS-Bench-101 and NAS-Bench-201, and it achieves the state-of-the-art ImageNet performance on the NASNet search space.
arXiv Detail & Related papers (2021-02-21T01:58:43Z) - MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS)
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z) - NAS-Bench-NLP: Neural Architecture Search Benchmark for Natural Language Processing [12.02718579660613]
We step outside the computer vision domain by leveraging the language modeling task, which is the core of natural language processing (NLP).
We provide a search space of recurrent neural networks on text datasets and train 14k architectures within it.
We have conducted both intrinsic and extrinsic evaluation of the trained models using datasets for semantic relatedness and language understanding evaluation.
arXiv Detail & Related papers (2020-06-12T12:19:06Z) - A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)
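Several of the papers listed above predict performance from graph structure alone (e.g. "Network Graph Based Neural Architecture Search"). As a hedged illustration of that idea, not any listed paper's actual method, one can extract cheap DAG descriptors and fit a linear regressor; the accuracies below are random stand-ins for real benchmark values, and all function names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def graph_properties(adj):
    """Cheap structural descriptors of a DAG-encoded cell: node count,
    edge count, mean out-degree, and longest-path depth (matrix powers)."""
    n = adj.shape[0]
    num_edges = adj.sum()
    depth, power = 0, adj.copy()
    while power.any():  # adj is strictly upper-triangular (nilpotent), so this halts
        depth += 1
        power = power @ adj
    return np.array([n, num_edges, num_edges / n, depth])

def random_dag(n=5, p=0.5):
    # Random strictly upper-triangular adjacency matrix: always a DAG
    return np.triu((rng.random((n, n)) < p).astype(float), k=1)

X = np.stack([graph_properties(random_dag()) for _ in range(50)])
y = rng.random(50)  # stand-in accuracies; real ones would come from a benchmark
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef  # predicted accuracy from graph properties alone
```

Because no model is trained on the full graph space, scoring a candidate costs only a few matrix operations, which is the efficiency argument made by the graph-properties line of work above.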
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.