GraphPNAS: Learning Distribution of Good Neural Architectures via Deep
Graph Generative Models
- URL: http://arxiv.org/abs/2211.15155v1
- Date: Mon, 28 Nov 2022 09:09:06 GMT
- Title: GraphPNAS: Learning Distribution of Good Neural Architectures via Deep
Graph Generative Models
- Authors: Muchen Li, Jeffrey Yunfan Liu, Leonid Sigal, Renjie Liao
- Abstract summary: We study neural architecture search (NAS) through the lens of learning random graph models.
We propose GraphPNAS, a deep graph generative model that learns a distribution of well-performing architectures.
We show that our proposed graph generator consistently outperforms the RNN-based one and achieves performance better than or comparable to state-of-the-art NAS methods.
- Score: 48.57083463364353
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural architectures can be naturally viewed as computational graphs.
Motivated by this perspective, in this paper we study neural architecture
search (NAS) through the lens of learning random graph models. In contrast to
existing NAS methods, which largely focus on searching for a single best
architecture, i.e., point estimation, we propose GraphPNAS, a deep graph
generative model that learns a distribution of well-performing architectures.
Relying on graph neural networks (GNNs), our GraphPNAS can better capture
topologies of good neural architectures and relations between operators
therein. Moreover, our graph generator leads to a learnable probabilistic
search method that is more flexible and efficient than the commonly used RNN
generator and random search methods. Finally, we learn our generator via an
efficient reinforcement learning formulation for NAS. To assess the
effectiveness of our GraphPNAS, we conduct extensive experiments on three
search spaces, including the challenging RandWire on TinyImageNet, ENAS on
CIFAR10, and NAS-Bench-101/201. The complexity of RandWire is significantly
larger than that of other search spaces in the literature. We show that our
proposed graph generator consistently outperforms the RNN-based one and achieves
performance better than or comparable to that of state-of-the-art NAS methods.
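The abstract outlines the overall loop: a GNN-based generator samples architecture graphs, each sample is evaluated, and the generator is updated with a policy gradient. Below is a minimal REINFORCE sketch of that loop in Python/PyTorch. This is not the authors' code; generator, evaluate, and the moving-average baseline are hypothetical stand-ins for components the abstract does not spell out.

import torch

def reinforce_step(generator, evaluate, optimizer, baseline,
                   num_samples=8, ema=0.9):
    """One policy-gradient update of an architecture generator (sketch).

    Assumes generator.sample() returns (graph, log_prob) with log_prob a
    differentiable scalar tensor, and evaluate(graph) returns a reward
    such as validation accuracy. Both are hypothetical interfaces.
    """
    losses, rewards = [], []
    for _ in range(num_samples):
        graph, log_prob = generator.sample()   # sample an architecture graph
        reward = evaluate(graph)               # e.g., validation accuracy
        rewards.append(reward)
        # REINFORCE with a baseline to reduce gradient variance.
        losses.append(-(reward - baseline) * log_prob)
    loss = torch.stack(losses).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Exponential moving average of rewards serves as the baseline.
    baseline = ema * baseline + (1 - ema) * (sum(rewards) / len(rewards))
    return baseline

In a full run, this step would repeat for many rounds, with the reward coming from (partially) training each sampled architecture; the paper's actual formulation may differ in its baseline and sampling details.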
Related papers
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism, which identifies important sub-architectures via valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance while using half or fewer model parameters in the searched GNNs and a sparser graph.
arXiv Detail & Related papers (2024-06-24T06:53:37Z)
- Efficient and Explainable Graph Neural Architecture Search via Monte-Carlo Tree Search [5.076419064097733]
Graph neural networks (GNNs) are powerful tools for performing data science tasks in various domains.
To save human effort and computational cost, graph neural architecture search (Graph NAS) has been used to search for a sub-optimal GNN architecture.
We propose ExGNAS, which consists of (i) a simple search space that can adapt to various graphs and (ii) a search algorithm that makes the decision process explainable.
arXiv Detail & Related papers (2023-08-30T03:21:45Z)
- NAS-Bench-Graph: Benchmarking Graph Neural Architecture Search [55.75621026447599]
We propose NAS-Bench-Graph, a tailored benchmark that supports unified, reproducible, and efficient evaluations for GraphNAS.
Specifically, we construct a unified, expressive yet compact search space, covering 26,206 unique graph neural network (GNN) architectures.
Based on our proposed benchmark, the performance of GNN architectures can be directly obtained from a look-up table without any further computation (a toy sketch of this look-up idea follows this entry).
arXiv Detail & Related papers (2022-06-18T10:17:15Z)
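The NAS-Bench-Graph entry above notes that architecture performance can be read directly from a look-up table. Here is a toy illustration of that tabular-benchmark idea; the keys and fields are hypothetical and do not reflect NAS-Bench-Graph's actual data format or API.

# Toy tabular NAS benchmark: every architecture in a fixed space has
# precomputed metrics, so "evaluation" reduces to a dictionary lookup.
benchmark = {
    # canonical architecture encoding -> precomputed metrics (made-up values)
    ("gcn", "gat", "skip"): {"val_acc": 0.713, "latency_ms": 4.2},
    ("sage", "gin", "none"): {"val_acc": 0.698, "latency_ms": 3.7},
}

def evaluate(arch):
    """Return the precomputed validation accuracy -- no training required."""
    return benchmark[arch]["val_acc"]

print(evaluate(("gcn", "gat", "skip")))  # -> 0.713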
- GraphPAS: Parallel Architecture Search for Graph Neural Networks [12.860313120881996]
We propose a parallel graph architecture search (GraphPAS) framework for graph neural networks.
In GraphPAS, we explore the search space in parallel via a sharing-based evolutionary learning scheme.
Experimental results show that GraphPAS outperforms state-of-the-art models in both efficiency and accuracy.
arXiv Detail & Related papers (2021-12-07T02:55:24Z)
- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can find GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z)
- Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose the Search to Aggregate NEighborhood (SANE) framework to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we derive a differentiable search algorithm that is more efficient than previous reinforcement-learning-based methods (a generic sketch of the differentiable-search idea follows this entry).
arXiv Detail & Related papers (2021-04-14T03:15:19Z)
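The SANE entry above contrasts a differentiable search algorithm with RL-based methods. As a generic sketch of how differentiable NAS typically works (in the spirit of DARTS, not necessarily SANE's exact algorithm), each edge mixes candidate operations with softmax-weighted architecture parameters, so the search itself is trained by gradient descent:

import torch
import torch.nn as nn

class MixedOp(nn.Module):
    """Softmax-weighted mixture of candidate operations (illustrative)."""

    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        # One learnable architecture parameter per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(ops)))

    def forward(self, x):
        weights = torch.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Usage: candidate aggregators become ops; after search, keep argmax(alpha).
mixed = MixedOp([nn.Linear(16, 16), nn.Identity()])
out = mixed(torch.randn(4, 16))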
- Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a newly designed search space.
We design the Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z)
- Simplifying Architecture Search for Graph Neural Network [38.45540097927176]
We propose the SNAG framework, consisting of a novel search space and a reinforcement-learning-based search algorithm.
Experiments on real-world datasets demonstrate the effectiveness of the SNAG framework compared to human-designed GNNs and NAS methods.
arXiv Detail & Related papers (2020-08-26T16:24:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.