Search For Deep Graph Neural Networks
- URL: http://arxiv.org/abs/2109.10047v1
- Date: Tue, 21 Sep 2021 09:24:59 GMT
- Title: Search For Deep Graph Neural Networks
- Authors: Guosheng Feng, Chunnan Wang, Hongzhi Wang
- Abstract summary: Current GNN-oriented NAS methods focus on searching for different layer aggregation components within shallow and simple architectures.
We propose a GNN generation pipeline with a novel two-stage search space, which aims at automatically generating high-performance and transferable deep GNN models in a block-wise manner.
Experiments on real-world datasets show that our generated GNN models outperform existing manually designed and NAS-based ones.
- Score: 4.3002928862077825
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current GNN-oriented NAS methods focus on searching for different layer
aggregation components within shallow and simple architectures, which are limited
by the 'over-smoothing' problem. To further explore the benefits of structural
diversity and depth in GNN architectures, we propose a GNN generation pipeline
with a novel two-stage search space, which aims at automatically generating
high-performance and transferable deep GNN models in a block-wise manner.
Meanwhile, to alleviate the 'over-smoothing' problem, we incorporate multiple
flexible residual connections in our search space and apply identity mapping in
the basic GNN layers. For the search algorithm, we use deep Q-learning with an
epsilon-greedy exploration strategy and reward reshaping. Extensive experiments
on real-world datasets show that our generated GNN models outperform existing
manually designed and NAS-based ones.
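The two over-smoothing mitigations named in the abstract (flexible residual connections and identity mapping in the basic GNN layers) can be illustrated with a short sketch. The snippet below is a minimal illustration assuming PyTorch and dense tensors; the class name `ResidualIdentityGNNLayer` and the mixing coefficient `beta` are hypothetical, not the paper's implementation.

```python
# Minimal sketch (not the authors' code): a GCN-style layer combining a
# residual skip connection with identity mapping on the learned transform.
import torch
import torch.nn as nn

class ResidualIdentityGNNLayer(nn.Module):
    def __init__(self, dim: int, beta: float = 0.1):  # beta is an assumed hyperparameter
        super().__init__()
        self.weight = nn.Linear(dim, dim, bias=False)
        self.beta = beta  # how much of the learned transform to mix in

    def forward(self, h: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        agg = adj_norm @ h  # message passing with a pre-normalized adjacency matrix
        # Identity mapping: apply (1 - beta) * I + beta * W to the aggregated
        # features, so an untrained layer stays close to the identity.
        out = (1.0 - self.beta) * agg + self.beta * self.weight(agg)
        return torch.relu(out + h)  # flexible residual connection to the layer input
```

Keeping each layer near the identity at initialization is what lets such blocks be stacked deeply without node representations collapsing into indistinguishable vectors.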
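The search algorithm can be sketched in the same hedged spirit. Below is a minimal illustration of epsilon-greedy action selection for a deep Q-learning controller that picks one block per decision step; the state encoding, candidate-block count, and network sizes are illustrative assumptions rather than the paper's setup.

```python
# Minimal sketch of one epsilon-greedy step of a deep Q-learning controller.
import random
import torch
import torch.nn as nn

NUM_BLOCK_CHOICES = 8   # assumed number of candidate blocks per decision
STATE_DIM = 16          # assumed encoding size of the partial architecture

q_net = nn.Sequential(
    nn.Linear(STATE_DIM, 64),
    nn.ReLU(),
    nn.Linear(64, NUM_BLOCK_CHOICES),  # one Q-value per candidate block
)

def select_block(state: torch.Tensor, epsilon: float) -> int:
    """Explore a random block with probability epsilon; otherwise pick the
    block with the highest predicted Q-value (exploit)."""
    if random.random() < epsilon:
        return random.randrange(NUM_BLOCK_CHOICES)
    with torch.no_grad():
        return int(q_net(state).argmax().item())

# Usage: one decision step on a random partial-architecture encoding.
# Training the Q-network (with a reshaped reward, e.g. based on validation
# accuracy of the generated model) is omitted here.
action = select_block(torch.randn(STATE_DIM), epsilon=0.2)
```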
Related papers
- Multicoated and Folded Graph Neural Networks with Strong Lottery Tickets [3.0894823679470087]
This paper introduces the Multi-Stage Folding and Unshared Masks methods to expand the search space in terms of both architecture and parameters.
By achieving high sparsity, competitive performance, and high memory efficiency (up to a 98.7% reduction), the method demonstrates its suitability for energy-efficient graph processing.
arXiv Detail & Related papers (2023-12-06T02:16:44Z) - Automated Search-Space Generation Neural Architecture Search [45.902445271519596]
ASGNAS produces high-performing sub-networks in a one-shot manner.
ASGNAS delivers three notable contributions to minimize human effort.
The library will be released at https://github.com/tianyic/only_train_once.
arXiv Detail & Related papers (2023-05-25T19:41:40Z) - Designing the Topology of Graph Neural Networks: A Novel Feature Fusion
Perspective [12.363386808994079]
We learn to design the topology of GNNs from a novel feature fusion perspective, dubbed F$^2$GNN.
We develop a neural architecture search method on top of the unified framework which contains a set of selection and fusion operations.
The performance gains on eight real-world datasets demonstrate the effectiveness of F$2$GNN.
arXiv Detail & Related papers (2021-12-29T13:06:12Z) - Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can search for better GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z) - Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, Search to Aggregate NEighborhood (SANE), to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we develop a differentiable search algorithm that is more efficient than previous reinforcement-learning-based methods.
arXiv Detail & Related papers (2021-04-14T03:15:19Z) - Rethinking Graph Neural Network Search from Message-passing [120.62373472087651]
This paper proposes Graph Neural Architecture Search (GNAS) with a newly designed search space.
We design the Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
arXiv Detail & Related papers (2021-03-26T06:10:41Z) - Simplifying Architecture Search for Graph Neural Network [38.45540097927176]
We propose the SNAG framework, consisting of a novel search space and a reinforcement-learning-based search algorithm.
Experiments on real-world datasets demonstrate the effectiveness of the SNAG framework compared to human-designed GNNs and NAS methods.
arXiv Detail & Related papers (2020-08-26T16:24:03Z) - Neural Architecture Search as Sparse Supernet [78.09905626281046]
This paper aims at enlarging the problem of Neural Architecture Search (NAS) from Single-Path and Multi-Path Search to automated Mixed-Path Search.
We model the NAS problem as a sparse supernet using a new continuous architecture representation with a mixture of sparsity constraints.
The sparse supernet enables us to automatically derive sparsely mixed paths over a compact set of nodes.
arXiv Detail & Related papers (2020-07-31T14:51:52Z) - Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that formulates the sampling procedure and message passing of GNNs as a combined learning process (see the sketch after this list).
arXiv Detail & Related papers (2020-06-26T17:03:06Z) - Graph Neural Networks for Motion Planning [108.51253840181677]
We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRT).
Experiments with critical sampling, a pendulum, and a six-DoF robot arm show that GNNs improve on traditional analytic methods as well as learning approaches using fully-connected or convolutional neural networks.
arXiv Detail & Related papers (2020-06-11T08:19:06Z)
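As referenced in the Policy-GNN entry above, the following is a minimal sketch of its core idea: a learned policy chooses, per node, how many aggregation steps to apply. All dimensions, the mean-aggregation operator, and the names (`policy`, `aggregate`) are illustrative assumptions, not the paper's implementation, and the reinforcement-learning training of the policy is omitted.

```python
# Minimal sketch: per-node aggregation depth chosen by a learned policy.
import torch
import torch.nn as nn

MAX_HOPS = 3   # assumed maximum number of aggregation rounds
FEAT_DIM = 8   # assumed node feature size

policy = nn.Linear(FEAT_DIM, MAX_HOPS)  # logits over hop counts {1..MAX_HOPS}

def aggregate(h: torch.Tensor, adj_norm: torch.Tensor, hops: torch.Tensor) -> torch.Tensor:
    """Run up to MAX_HOPS rounds of neighbor averaging; each node keeps the
    representation from the round its policy action selected."""
    out, cur = h.clone(), h
    for k in range(1, MAX_HOPS + 1):
        cur = adj_norm @ cur  # one round of mean aggregation
        out = torch.where((hops == k).unsqueeze(-1), cur, out)
    return out

h = torch.randn(5, FEAT_DIM)         # 5 toy nodes
adj_norm = torch.full((5, 5), 0.2)   # toy row-normalized adjacency
hops = policy(h).argmax(dim=-1) + 1  # per-node hop choice in 1..MAX_HOPS
print(aggregate(h, adj_norm, hops).shape)  # torch.Size([5, 8])
```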