Probabilistic Dual Network Architecture Search on Graphs
- URL: http://arxiv.org/abs/2003.09676v1
- Date: Sat, 21 Mar 2020 15:06:47 GMT
- Title: Probabilistic Dual Network Architecture Search on Graphs
- Authors: Yiren Zhao, Duo Wang, Xitong Gao, Robert Mullins, Pietro Lio, Mateja Jamnik
- Abstract summary: We present the first differentiable Network Architecture Search (NAS) for Graph Neural Networks (GNNs).
GNNs show promising performance on a wide range of tasks, but require a large amount of architecture engineering.
We use a fully gradient-based search approach to update architectural parameters, making it the first differentiable graph NAS method.
- Score: 13.140262952697746
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present the first differentiable Network Architecture Search (NAS) for
Graph Neural Networks (GNNs). GNNs show promising performance on a wide range
of tasks, but require a large amount of architecture engineering. First, graphs
are an inherently non-Euclidean and complex data structure, leading to
poor adaptivity of GNN architectures across different datasets. Second, a
typical graph block contains numerous different components, such as aggregation
and attention, generating a large combinatorial search space. To counter these
problems, we propose a Probabilistic Dual Network Architecture Search (PDNAS)
framework for GNNs. PDNAS not only optimises the operations within a single
graph block (micro-architecture), but also considers how these blocks should be
connected to each other (macro-architecture). The dual architecture (micro- and
macro-architectures) optimisation allows PDNAS to find deeper GNNs on diverse
datasets with better performance compared to other graph NAS methods. Moreover,
we use a fully gradient-based search approach to update architectural
parameters, making it the first differentiable graph NAS method. PDNAS
outperforms existing hand-designed GNNs and prior NAS results; for example, on the
PPI dataset, it beats its best competitors by 1.67 and 0.17 in F1 score.
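The fully gradient-based dual search can be pictured with a short sketch. Below is a minimal, hypothetical PyTorch illustration: Gumbel-Softmax logits pick the operation inside each block (micro-architecture), while sigmoid gates relax the block-to-block connections (macro-architecture). The op choices, names, and shapes are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DifferentiableGraphBlock(nn.Module):
    """One searchable block: a soft mixture over candidate micro-ops."""
    def __init__(self, dim, n_ops=3):
        super().__init__()
        # Stand-ins for real graph ops (aggregation, attention, ...).
        self.ops = nn.ModuleList(nn.Linear(dim, dim) for _ in range(n_ops))
        # Architectural logits, updated by gradient descent with the weights.
        self.alpha = nn.Parameter(torch.zeros(n_ops))

    def forward(self, x, tau=1.0):
        # Differentiable, near-one-hot weighting over the candidate ops.
        w = F.gumbel_softmax(self.alpha, tau=tau)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

class DualSearchNet(nn.Module):
    """Macro search: learnable gates over connections between blocks."""
    def __init__(self, dim, n_blocks=4):
        super().__init__()
        self.blocks = nn.ModuleList(
            DifferentiableGraphBlock(dim) for _ in range(n_blocks))
        # beta[i, j]: gate logit for feeding output j into block i.
        self.beta = nn.Parameter(torch.zeros(n_blocks, n_blocks))

    def forward(self, x):
        outs = [x]
        for i, block in enumerate(self.blocks):
            gates = torch.sigmoid(self.beta[i, : i + 1])
            inp = sum(g * h for g, h in zip(gates, outs))
            outs.append(block(inp))
        return outs[-1]
```

In a full system, the architectural parameters (alpha, beta) would typically be trained against a validation objective while the operation weights train on the task loss.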
Related papers
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism, which identifies important sub-architectures via valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance with half or fewer model parameters in the searched GNNs and a sparser graph (a toy sketch of the sparsification idea follows this entry).
arXiv Detail & Related papers (2024-06-24T06:53:37Z) - Efficacy of Neural Prediction-Based Zero-Shot NAS [0.04096453902709291]
- Efficacy of Neural Prediction-Based Zero-Shot NAS [0.04096453902709291]
We propose a novel approach for zero-shot Neural Architecture Search (NAS) using deep learning.
Our method employs Fourier sum of sines encoding for convolutional kernels, enabling the construction of a computational feed-forward graph with a structure similar to the architecture under evaluation.
Experimental results show that our approach surpasses previous methods using graph convolutional networks in terms of correlation on the NAS-Bench-201 dataset and exhibits a higher convergence rate (a rough sketch of the encoding follows this entry).
arXiv Detail & Related papers (2023-08-31T14:54:06Z) - Efficient and Explainable Graph Neural Architecture Search via
- Efficient and Explainable Graph Neural Architecture Search via Monte-Carlo Tree Search [5.076419064097733]
Graph neural networks (GNNs) are powerful tools for performing data science tasks in various domains.
To save human efforts and computational costs, graph neural architecture search (Graph NAS) has been used to search for a sub-optimal GNN architecture.
We propose ExGNAS, which consists of (i) a simple search space that can adapt to various graphs and (ii) a search algorithm that makes the decision process explainable (a simplified search-loop sketch follows this entry).
arXiv Detail & Related papers (2023-08-30T03:21:45Z) - GraphPNAS: Learning Distribution of Good Neural Architectures via Deep
- GraphPNAS: Learning Distribution of Good Neural Architectures via Deep Graph Generative Models [48.57083463364353]
We study neural architecture search (NAS) through the lens of learning random graph models.
We propose GraphPNAS, a deep graph generative model that learns a distribution of well-performing architectures.
We show that our proposed graph generator consistently outperforms RNN-based ones and achieves performance better than or comparable to state-of-the-art NAS methods.
arXiv Detail & Related papers (2022-11-28T09:09:06Z) - Architecture Augmentation for Performance Predictor Based on Graph
Isomorphism [15.478663248038307]
We propose an effective deep neural network (DNN) architecture augmentation method named GIAug.
We show that GIAug can significantly enhance the performance of most state-of-the-art peer predictors.
In addition, GIAug can save up to three orders of magnitude in computation cost on ImageNet (an augmentation sketch follows this entry).
arXiv Detail & Related papers (2022-07-03T09:04:09Z) - NAS-Bench-Graph: Benchmarking Graph Neural Architecture Search [55.75621026447599]
- NAS-Bench-Graph: Benchmarking Graph Neural Architecture Search [55.75621026447599]
We propose NAS-Bench-Graph, a tailored benchmark that supports unified, reproducible, and efficient evaluations for GraphNAS.
Specifically, we construct a unified, expressive yet compact search space, covering 26,206 unique graph neural network (GNN) architectures.
Based on our proposed benchmark, the performance of GNN architectures can be directly obtained from a look-up table without any further computation (illustrated below).
arXiv Detail & Related papers (2022-06-18T10:17:15Z) - UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
- UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z) - Arch-Graph: Acyclic Architecture Relation Predictor for
Task-Transferable Neural Architecture Search [96.31315520244605]
Arch-Graph is a transferable NAS method that predicts task-specific optimal architectures.
We show Arch-Graph's transferability and high sample efficiency across numerous tasks.
It finds architectures in the top 0.16% and 0.29% on average across two search spaces under a budget of only 50 models.
arXiv Detail & Related papers (2022-04-12T16:46:06Z) - Weak NAS Predictors Are All You Need [91.11570424233709]
Recent predictor-based NAS approaches attempt to solve the problem with two key steps: sampling some architecture-performance pairs and fitting a proxy accuracy predictor.
We shift the paradigm from finding a complicated predictor that covers the whole architecture space to a set of weaker predictors that progressively move towards the high-performance sub-space.
Our method needs fewer samples to find top-performing architectures on NAS-Bench-101 and NAS-Bench-201, and it achieves state-of-the-art ImageNet performance in the NASNet search space (a simplified sketch follows this entry).
arXiv Detail & Related papers (2021-02-21T01:58:43Z)
This list is automatically generated from the titles and abstracts of the papers on this site.