Design Space for Graph Neural Networks
- URL: http://arxiv.org/abs/2011.08843v2
- Date: Fri, 23 Jul 2021 20:37:23 GMT
- Title: Design Space for Graph Neural Networks
- Authors: Jiaxuan You, Rex Ying, Jure Leskovec
- Abstract summary: We study the architectural design space for Graph Neural Networks (GNNs) which consists of 315,000 different designs over 32 different predictive tasks.
Our key results include: (1) A comprehensive set of guidelines for designing well-performing GNNs; (2) while best GNN designs for different tasks vary significantly, the GNN task space allows for transferring the best designs across different tasks; (3) models discovered using our design space achieve state-of-the-art performance.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The rapid evolution of Graph Neural Networks (GNNs) has led to a growing
number of new architectures as well as novel applications. However, current
research focuses on proposing and evaluating specific architectural designs of
GNNs, as opposed to studying the more general design space of GNNs that
consists of a Cartesian product of different design dimensions, such as the
number of layers or the type of the aggregation function. Additionally, GNN
designs are often specialized to a single task, yet few efforts have been made
to understand how to quickly find the best GNN design for a novel task or a
novel dataset. Here we define and systematically study the architectural design
space for GNNs which consists of 315,000 different designs over 32 different
predictive tasks. Our approach features three key innovations: (1) A general
GNN design space; (2) a GNN task space with a similarity metric, so that for a
given novel task/dataset, we can quickly identify/transfer the best performing
architecture; (3) an efficient and effective design space evaluation method
which allows insights to be distilled from a huge number of model-task
combinations. Our key results include: (1) A comprehensive set of guidelines
for designing well-performing GNNs; (2) while best GNN designs for different
tasks vary significantly, the GNN task space allows for transferring the best
designs across different tasks; (3) models discovered using our design space
achieve state-of-the-art performance. Overall, our work offers a principled and
scalable approach to transition from studying individual GNN designs for
specific tasks, to systematically studying the GNN design space and the task
space. Finally, we release GraphGym, a powerful platform for exploring
different GNN designs and tasks. GraphGym features modularized GNN
implementation, standardized GNN evaluation, and reproducible and scalable
experiment management.
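The abstract's two central constructs can be made concrete with a short sketch: the design space as a Cartesian product of design dimensions, and task similarity as the rank correlation of a fixed set of "anchor" designs' performances across tasks. The snippet below is a minimal illustration written against the abstract, not GraphGym's actual API; the dimension names, anchor accuracies, and the choice of Kendall's tau are assumptions for demonstration.

```python
# Toy sketch of (1) a design space as a Cartesian product of design
# dimensions and (2) a rank-correlation task-similarity metric.
# All names and numbers are illustrative, not GraphGym's API.
from itertools import product
from scipy.stats import kendalltau

# (1) Enumerate every combination of a few example design dimensions.
design_dimensions = {
    "layers_mp": [2, 4, 6, 8],              # number of message-passing layers
    "aggregation": ["mean", "max", "sum"],
    "activation": ["relu", "prelu", "swish"],
    "batch_norm": [True, False],
}
designs = [dict(zip(design_dimensions, combo))
           for combo in product(*design_dimensions.values())]
print(len(designs))  # 4 * 3 * 3 * 2 = 72 designs in this toy space

# (2) Rank a small set of anchor designs by measured performance on each
# task; tasks whose rankings correlate tend to prefer the same designs.
acc_task_a = [0.71, 0.64, 0.80, 0.55, 0.62]  # hypothetical anchor accuracies
acc_task_b = [0.68, 0.61, 0.77, 0.52, 0.60]  # ranking similar to task A
acc_task_c = [0.40, 0.75, 0.33, 0.81, 0.79]  # ranking roughly inverted

tau_ab, _ = kendalltau(acc_task_a, acc_task_b)
tau_ac, _ = kendalltau(acc_task_a, acc_task_c)
print(tau_ab)  # close to +1: transfer the best design across tasks
print(tau_ac)  # negative: designs are unlikely to transfer
```

In this toy space, a high tau between two tasks suggests the best design found on one task is a strong candidate for the other, which is the mechanism behind key result (2) above.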
Related papers
- Computation-friendly Graph Neural Network Design by Accumulating Knowledge on Large Language Models [20.31388126105889]
Graph Neural Networks (GNNs) have shown remarkable success but are hampered by the complexity of their architecture designs.
To reduce this human workload, researchers have sought to develop automated algorithms that design GNNs.
arXiv Detail & Related papers (2024-08-13T08:22:01Z)
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensembling training manner, named EnGCN, to address the existing issues.
Our proposed method has achieved new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)
- PaSca: a Graph Neural Architecture Search System under the Scalable Paradigm [24.294196319217907]
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph-based tasks.
However, GNNs do not scale well with data size or with the number of message-passing steps.
This paper proposes PaSca, a new paradigm and system that offers a principled approach to systematically construct and explore the design space for scalable GNNs.
arXiv Detail & Related papers (2022-03-01T17:26:50Z)
- Space4HGNN: A Novel, Modularized and Reproducible Platform to Evaluate Heterogeneous Graph Neural Network [51.07168862821267]
We propose a unified framework covering most HGNNs, consisting of three components: heterogeneous linear transformation, heterogeneous graph transformation, and heterogeneous message passing layer.
We then build a platform, Space4HGNN, by defining a design space for HGNNs based on the unified framework, which offers modularized components, reproducible implementations, and standardized evaluation for HGNNs (see the sketch below).
arXiv Detail & Related papers (2022-02-18T13:11:35Z)
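As a purely illustrative reading of the three-component factorization described in the Space4HGNN entry above (the class, shapes, and mean aggregation below are assumptions, not the platform's API), a heterogeneous GNN layer might be sketched as:

```python
# Assumed sketch of a heterogeneous GNN layer factored into the three
# components named above; not Space4HGNN's actual implementation.
import torch
import torch.nn as nn

class HeteroGNNLayer(nn.Module):
    def __init__(self, node_types, in_dim, out_dim):
        super().__init__()
        # (1) Heterogeneous linear transformation: one projection per node
        # type, mapping differently-typed features into a shared space.
        self.lin = nn.ModuleDict({t: nn.Linear(in_dim, out_dim) for t in node_types})

    def forward(self, x_dict, adj):
        # x_dict maps node type -> [n_t, in_dim] features; `adj` is a dense
        # [N, N] adjacency over all nodes (concatenated in x_dict order),
        # assumed to come from (2) a heterogeneous graph transformation
        # (e.g. metapath extraction) precomputed outside this layer.
        h = torch.cat([self.lin[t](x) for t, x in x_dict.items()], dim=0)
        # (3) Heterogeneous message passing, reduced here to degree-normalized
        # mean aggregation over the transformed graph for this toy example.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        return (adj @ h) / deg

# Toy usage: two "paper" nodes and one "author" node with 4-dim features.
layer = HeteroGNNLayer(["paper", "author"], in_dim=4, out_dim=8)
x_dict = {"paper": torch.randn(2, 4), "author": torch.randn(1, 4)}
adj = torch.tensor([[0., 0., 1.], [0., 0., 1.], [1., 1., 0.]])
print(layer(x_dict, adj).shape)  # torch.Size([3, 8])
```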
- Edge-featured Graph Neural Architecture Search [131.4361207769865]
We propose Edge-featured Graph Neural Architecture Search (EGNAS) to find the optimal GNN architecture.
Specifically, we design rich entity and edge updating operations to learn high-order representations.
We show that EGNAS can discover GNNs with higher performance than current state-of-the-art human-designed and search-based GNNs (see the sketch below).
arXiv Detail & Related papers (2021-09-03T07:53:18Z)
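For intuition about the "entity and edge updating operations" the EGNAS entry above refers to, here is a minimal, assumed sketch of one edge-featured message-passing layer. EGNAS searches over such operations rather than fixing them; all names and shapes below are illustrative.

```python
# Assumed sketch of edge-featured message passing: messages depend on source
# node AND edge features, and edge features are updated alongside node ones.
import torch
import torch.nn as nn

class EdgeFeaturedLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(2 * dim, dim)       # message from (src node, edge)
        self.node_upd = nn.Linear(2 * dim, dim)  # (node, aggregated msgs) -> node
        self.edge_upd = nn.Linear(3 * dim, dim)  # (src, dst, edge) -> edge

    def forward(self, x, edge_index, e):
        # x: [N, dim] node features; e: [E, dim] edge features;
        # edge_index: [2, E] (source, target) index pairs.
        src, dst = edge_index
        m = torch.relu(self.msg(torch.cat([x[src], e], dim=-1)))
        agg = torch.zeros_like(x).index_add_(0, dst, m)  # sum-aggregate per node
        x_new = torch.relu(self.node_upd(torch.cat([x, agg], dim=-1)))
        e_new = torch.relu(self.edge_upd(torch.cat([x_new[src], x_new[dst], e], dim=-1)))
        return x_new, e_new

# Toy usage: 4 nodes, 3 directed edges, 8-dim features throughout.
layer = EdgeFeaturedLayer(dim=8)
x, e = torch.randn(4, 8), torch.randn(3, 8)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
x, e = layer(x, edge_index, e)
```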
- Third ArchEdge Workshop: Exploring the Design Space of Efficient Deep Neural Networks [14.195694804273801]
This paper gives an overview of our ongoing work on the design space exploration of efficient deep neural networks (DNNs).
We cover two aspects: (1) static architecture design efficiency and (2) dynamic model execution efficiency.
We highlight several open questions that are poised to draw research attention in the next few years.
arXiv Detail & Related papers (2020-11-22T01:56:46Z)
- Architectural Implications of Graph Neural Networks [17.01480604968118]
Graph neural networks (GNNs) represent an emerging line of deep learning models that operate on graph structures.
GNNs are not as well understood in the systems and architecture community as counterparts such as multi-layer perceptrons and convolutional neural networks.
arXiv Detail & Related papers (2020-09-02T03:36:24Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman tests, are inefficient because they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of structural features for graph representation learning (see the sketch below).
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
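A toy illustration of the distance-encoding idea summarized above (my sketch, not the paper's released code): augment each node with an encoding of its shortest-path distance to a target node set, which distinguishes substructures that plain message passing maps to identical representations.

```python
# Assumed sketch: one-hot shortest-path-distance features w.r.t. a target set.
import networkx as nx

def distance_encoding(G, targets, max_dist=5):
    """One-hot distance feature of each node w.r.t. the nearest target node."""
    feats = {}
    for v in G.nodes:
        dists = (nx.shortest_path_length(G, v, t)
                 for t in targets if nx.has_path(G, v, t))
        d = min(list(dists) + [max_dist])  # cap unreachable / distant nodes
        one_hot = [0] * (max_dist + 1)
        one_hot[d] = 1
        feats[v] = one_hot
    return feats

# On a 6-cycle with target {0}, nodes 1 and 5 share a feature (distance 1)
# while node 3 (distance 3) is distinguished from both.
print(distance_encoding(nx.cycle_graph(6), targets={0}))
```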
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module that boosts GNNs' ability to preserve graph structures (see the sketch below).
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
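One way to picture the plug-in described in the Eigen-GNN entry above (an assumed sketch based on the summary, not the paper's released code) is to concatenate leading eigenvectors of the adjacency matrix to the node features, so structural information is available even to a shallow, feature-centric GNN.

```python
# Assumed sketch: eigenvector features as a structure-preserving plug-in.
import numpy as np

def eigen_features(adj, k=4):
    """Top-k (largest-magnitude) eigenvectors of a symmetric adjacency."""
    vals, vecs = np.linalg.eigh(adj)       # eigenvalues in ascending order
    order = np.argsort(-np.abs(vals))[:k]  # k largest-magnitude components
    return vecs[:, order]                  # shape [N, k]

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = np.random.randn(4, 8)                  # original node features
x_aug = np.concatenate([x, eigen_features(adj)], axis=1)
print(x_aug.shape)                         # (4, 12): features + structure
```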