Rethinking Graph Neural Network Search from Message-passing
- URL: http://arxiv.org/abs/2103.14282v1
- Date: Fri, 26 Mar 2021 06:10:41 GMT
- Title: Rethinking Graph Neural Network Search from Message-passing
- Authors: Shaofei Cai, Liang Li, Jincan Deng, Beichen Zhang, Zheng-Jun Zha,
Li Su and Qingming Huang
- Abstract summary: This paper proposes Graph Neural Architecture Search (GNAS) with a novel search space.
We design the Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations.
Experiments show that our GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
- Score: 120.62373472087651
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Graph neural networks (GNNs) have recently emerged as a standard toolkit
for learning from data on graphs. Current GNN design relies on immense human
expertise to explore different message-passing mechanisms, and on manual
enumeration to determine the proper message-passing depth. Inspired by the
strong search capability of neural architecture search (NAS) for CNNs, this
paper proposes Graph Neural Architecture Search (GNAS) with a novel search
space. GNAS can automatically learn a better architecture with the optimal
message-passing depth on the graph. Specifically, we design the Graph Neural
Architecture Paradigm (GAP), which uses a tree-topology computation procedure
and two types of fine-grained atomic operations (feature filtering and
neighbor aggregation) derived from the message-passing mechanism to construct
a powerful graph network search space. Feature filtering performs adaptive
feature selection, and neighbor aggregation captures structural information
and computes neighbors' statistics. Experiments show that GNAS can search for
better GNNs with multiple message-passing mechanisms and optimal
message-passing depth. The searched networks achieve remarkable improvements
over state-of-the-art manually designed and search-based GNNs on five
large-scale datasets across three classical graph tasks. Code is available at
https://github.com/phython96/GNAS-MP.
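As a rough illustration of the two atomic operation types named in the abstract, the following is a minimal PyTorch sketch. It is not the released GNAS-MP implementation: the class names (FeatureFilter, NeighborAggregate), the sigmoid gating, and the mean aggregation are assumptions made here for clarity.

```python
# Illustrative sketch of the two atomic operation types (NOT the GNAS-MP code).
import torch
import torch.nn as nn


class FeatureFilter(nn.Module):
    """Adaptive feature selection via a learned, element-wise gate."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, dim]; soft-select which feature channels to keep
        return torch.sigmoid(self.gate(x)) * x


class NeighborAggregate(nn.Module):
    """Neighbor aggregation: collect neighbors' statistics (here, the mean)."""

    def __init__(self, dim: int):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # adj: dense [num_nodes, num_nodes] adjacency matrix
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        return self.lin(adj @ x / deg)


class MessagePassingStep(nn.Module):
    """One message-passing step composed from the two atomic operations."""

    def __init__(self, dim: int):
        super().__init__()
        self.filter = FeatureFilter(dim)
        self.aggregate = NeighborAggregate(dim)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        return torch.relu(x + self.aggregate(self.filter(x), adj))


if __name__ == "__main__":
    x = torch.randn(5, 16)                   # 5 nodes, 16-dim features
    adj = (torch.rand(5, 5) > 0.5).float()   # random dense adjacency
    print(MessagePassingStep(16)(x, adj).shape)  # torch.Size([5, 16])
```

In the paradigm described in the abstract, a search procedure would select and wire operations of these two types into a tree-topology computation graph, with the number of stacked steps determining the message-passing depth.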
Related papers
- Towards Lightweight Graph Neural Network Search with Curriculum Graph Sparsification [48.334100429553644]
This paper proposes a joint graph data and architecture mechanism that identifies important sub-architectures via valuable graph data.
To search for optimal lightweight Graph Neural Networks (GNNs), we propose a Lightweight Graph Neural Architecture Search with Graph SparsIfication and Network Pruning (GASSIP) method.
Our method achieves on-par or even higher node classification performance while using half or fewer model parameters in the searched GNNs and a sparser graph.
arXiv Detail & Related papers (2024-06-24T06:53:37Z)
- Efficient and Explainable Graph Neural Architecture Search via Monte-Carlo Tree Search [5.076419064097733]
Graph neural networks (GNNs) are powerful tools for performing data science tasks in various domains.
To save human effort and computational cost, graph neural architecture search (Graph NAS) has been used to search for a sub-optimal GNN architecture.
We propose ExGNAS, which consists of (i) a simple search space that can adapt to various graphs and (ii) a search algorithm that makes the decision process explainable.
arXiv Detail & Related papers (2023-08-30T03:21:45Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
Its search space comprises node and relation learning operations that extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate significant and consistent improvements in model quality, with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can find better GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs.
arXiv Detail & Related papers (2021-09-03T07:53:18Z)
- Simplifying Architecture Search for Graph Neural Network [38.45540097927176]
We propose the SNAG framework, consisting of a novel search space and a reinforcement learning-based search algorithm.
Experiments on real-world datasets demonstrate the effectiveness of the SNAG framework compared to human-designed GNNs and existing NAS methods.
arXiv Detail & Related papers (2020-08-26T16:24:03Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology (a minimal sketch of such a filter follows this list).
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
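To make the graph-filter viewpoint in the last entry concrete, here is a minimal NumPy sketch (not the paper's code) of a polynomial graph convolutional filter y = sum_k h_k S^k x, together with a numerical check of permutation equivariance. The filter taps and the small random graph are illustrative assumptions.

```python
# Illustrative polynomial graph filter y = sum_k h_k * S^k x, where S is a
# graph shift operator (here the adjacency matrix). NOT the paper's code;
# the taps and the demo graph are assumed values.
import numpy as np


def graph_filter(S: np.ndarray, x: np.ndarray, h) -> np.ndarray:
    """Apply the polynomial filter sum_k h[k] * S^k @ x to the graph signal x."""
    y = np.zeros_like(x, dtype=float)
    Sk_x = x.astype(float)      # S^0 @ x
    for hk in h:
        y += hk * Sk_x
        Sk_x = S @ Sk_x         # advance to the next power of S
    return y


rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(5, 5)).astype(float)
A = np.triu(A, 1)
A = A + A.T                      # symmetric adjacency, no self-loops
x = rng.normal(size=5)           # a scalar signal on the 5 nodes
h = [0.5, 0.3, 0.2]              # filter taps (assumed values)

# Permutation equivariance: relabeling the nodes permutes the output the same way.
P = np.eye(5)[rng.permutation(5)]
lhs = graph_filter(P @ A @ P.T, P @ x, h)
rhs = P @ graph_filter(A, x, h)
print(np.allclose(lhs, rhs))     # True
```

Stacking such filters with pointwise nonlinearities yields the class of GNN architectures the entry above refers to, and the equivariance property carries over to the stacked network.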