A Survey on Graph Neural Network Acceleration: Algorithms, Systems, and
Customized Hardware
- URL: http://arxiv.org/abs/2306.14052v1
- Date: Sat, 24 Jun 2023 20:20:45 GMT
- Title: A Survey on Graph Neural Network Acceleration: Algorithms, Systems, and
Customized Hardware
- Authors: Shichang Zhang, Atefeh Sohrabizadeh, Cheng Wan, Zijie Huang, Ziniu Hu,
Yewen Wang, Yingyan (Celine) Lin, Jason Cong, Yizhou Sun
- Abstract summary: Graph neural networks (GNNs) are emerging for machine learning research on graph-structured data.
GNNs achieve state-of-the-art performance on many tasks, but they face scalability challenges when it comes to real-world applications.
We provide a taxonomy of GNN acceleration, review the existing approaches, and suggest future research directions.
- Score: 30.525912505620685
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) are emerging for machine learning research on
graph-structured data. GNNs achieve state-of-the-art performance on many tasks,
but they face scalability challenges when it comes to real-world applications
that involve large amounts of data and strict latency requirements. Many studies have been
conducted on how to accelerate GNNs in an effort to address these challenges.
These acceleration techniques touch on various aspects of the GNN pipeline,
from smart training and inference algorithms to efficient systems and
customized hardware. As research on GNN acceleration has grown rapidly, a
systematic treatment that provides a unified view and organizes the relevant
works has been lacking. In this survey, we provide a taxonomy
of GNN acceleration, review the existing approaches, and suggest future
research directions. Our taxonomic treatment of GNN acceleration connects the
existing works and sets the stage for further development in this area.
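To make the algorithm-level side of GNN acceleration mentioned in the abstract concrete, the sketch below shows uniform neighbor sampling, a widely used training-time technique (popularized by GraphSAGE-style methods) that aggregates over a bounded number of sampled neighbors instead of the full neighborhood. This is an illustrative sketch, not code from the survey; the function names, the fixed fan-out, and the NumPy-based mean aggregation are assumptions chosen for brevity.

```python
# Minimal sketch of neighbor-sampled GNN aggregation, illustrating one common
# algorithm-level GNN acceleration technique. Names and fan-out are illustrative.
import numpy as np

def sample_neighbors(adj_list, node, fanout, rng):
    """Uniformly sample up to `fanout` neighbors of `node`; bounds per-node cost."""
    neighbors = adj_list[node]
    if len(neighbors) <= fanout:
        return neighbors
    return list(rng.choice(neighbors, size=fanout, replace=False))

def sampled_mean_aggregate(features, adj_list, nodes, fanout=5, seed=0):
    """One mean-aggregation step over sampled neighborhoods (no learned weights)."""
    rng = np.random.default_rng(seed)
    out = np.zeros((len(nodes), features.shape[1]))
    for i, v in enumerate(nodes):
        nbrs = sample_neighbors(adj_list, v, fanout, rng)
        # Include the node itself so nodes with no sampled neighbors still get an embedding.
        out[i] = features[[v] + nbrs].mean(axis=0)
    return out

# Toy usage: 4 nodes on a path graph, 2-dimensional features.
adj_list = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
features = np.arange(8, dtype=float).reshape(4, 2)
print(sampled_mean_aggregate(features, adj_list, nodes=[0, 1, 2, 3], fanout=2))
```

Sampling bounds the per-node cost of aggregation at the price of a stochastic estimate of the full-neighborhood result, which is the kind of accuracy-versus-efficiency trade-off that algorithm-level acceleration methods navigate.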
Related papers
- Graph Neural Networks for Job Shop Scheduling Problems: A Survey [9.072608705759322]
Job shop scheduling problems (JSSPs) represent a critical and challenging class of optimization problems.
Recent years have witnessed a rapid increase in the application of graph neural networks (GNNs) to solve JSSPs.
This paper aims to thoroughly review prevailing GNN methods for different types of JSSPs and the closely related flow-shop scheduling problems.
arXiv Detail & Related papers (2024-06-20T08:22:07Z)
- Survey of Graph Neural Network for Internet of Things and NextG Networks [3.591122855617648]
Graph Neural Networks (GNNs) have emerged as a promising paradigm for effectively modeling graph-structured data and extracting insights from it.
This survey provides a detailed description of GNN terminology, architectures, and the different types of GNNs.
Next, we provide a detailed account of how GNNs have been leveraged for networking and tactical systems.
arXiv Detail & Related papers (2024-05-27T16:10:49Z)
- Unleash Graph Neural Networks from Heavy Tuning [33.948899558876604]
Graph Neural Networks (GNNs) are deep learning architectures designed for graph-structured data.
We propose a graph conditional latent diffusion framework (GNN-Diff) to generate high-performing GNNs directly by learning from checkpoints saved during a light-tuning coarse search.
arXiv Detail & Related papers (2024-05-21T06:23:47Z)
- Acceleration Algorithms in GNNs: A Survey [34.28669696478494]
Graph Neural Networks (GNNs) have demonstrated effectiveness in various graph-based tasks.
Their inefficiency in training and inference presents challenges for scaling up to real-world and large-scale graph applications.
A range of algorithms have been proposed to accelerate training and inference of GNNs.
arXiv Detail & Related papers (2024-05-07T08:34:33Z)
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Distributed Graph Neural Network Training: A Survey [51.77035975191926]
Graph neural networks (GNNs) are a class of deep learning models trained on graphs and have been successfully applied in various domains.
Despite their effectiveness, it is still challenging for GNNs to scale efficiently to large graphs.
As a remedy, distributed computing has become a promising solution for training large-scale GNNs (a toy sketch of partitioned aggregation appears after this list).
arXiv Detail & Related papers (2022-11-01T01:57:00Z)
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensembling training manner, named EnGCN, to address the existing issues.
Our proposed method has achieved new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)
- Computing Graph Neural Networks: A Survey from Algorithms to Accelerators [2.491032752533246]
Graph Neural Networks (GNNs) have exploded onto the machine learning scene in recent years owing to their capability to model and learn from graph-structured data.
This paper makes two main contributions: a review of the field of GNNs from the perspective of computing, and an in-depth analysis of current software and hardware acceleration schemes.
arXiv Detail & Related papers (2020-09-30T22:29:27Z)
- Attentive Graph Neural Networks for Few-Shot Learning [74.01069516079379]
Graph Neural Networks (GNNs) have demonstrated superior performance in many challenging applications, including few-shot learning tasks.
Despite their powerful capacity to learn and generalize from few samples, GNNs usually suffer from severe over-fitting and over-smoothing as the model becomes deep.
We propose a novel Attentive GNN to tackle these challenges, by incorporating a triple-attention mechanism.
arXiv Detail & Related papers (2020-07-14T07:43:09Z)
- Graph Neural Networks for Motion Planning [108.51253840181677]
We present two techniques, GNNs over dense fixed graphs for low-dimensional problems and sampling-based GNNs for high-dimensional problems.
We examine the ability of a GNN to tackle planning problems such as identifying critical nodes or learning the sampling distribution in Rapidly-exploring Random Trees (RRT).
Experiments with critical sampling, a pendulum and a six DoF robot arm show GNNs improve on traditional analytic methods as well as learning approaches using fully-connected or convolutional neural networks.
arXiv Detail & Related papers (2020-06-11T08:19:06Z)
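To make the distributed-training idea from the "Distributed Graph Neural Network Training" entry above concrete, the toy sketch below partitions a graph's nodes across two simulated workers; each worker aggregates features for its own nodes and must fetch features of neighbors owned by the other partition, which is the communication cost distributed GNN systems try to minimize. The partitioning scheme, names, and two-worker setup are illustrative assumptions, not code from any of the surveyed systems.

```python
# Toy simulation of data-parallel GNN aggregation over a node-partitioned graph.
# Two "workers" each own half the nodes; features of neighbors owned by the other
# worker must be fetched, illustrating the communication overhead that distributed
# GNN training systems aim to reduce. Purely illustrative.
import numpy as np

adj_list = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
features = np.eye(4)                      # one-hot features for 4 nodes
partition = {0: 0, 1: 0, 2: 1, 3: 1}      # node -> owning worker

def local_aggregate(worker, adj_list, features, partition):
    """Mean-aggregate neighbors for the worker's own nodes; count remote fetches."""
    owned = [v for v, w in partition.items() if w == worker]
    remote_fetches = 0
    out = {}
    for v in owned:
        nbr_feats = []
        for u in adj_list[v]:
            if partition[u] != worker:
                remote_fetches += 1       # in a real system: network communication
            nbr_feats.append(features[u])
        out[v] = np.mean(nbr_feats, axis=0)
    return out, remote_fetches

for worker in (0, 1):
    out, fetches = local_aggregate(worker, adj_list, features, partition)
    print(f"worker {worker}: aggregated nodes {sorted(out)} with {fetches} remote feature fetches")
```

Minimizing the number of cross-partition edges (and thus remote fetches) while keeping the per-worker load balanced is the central design choice in the distributed systems that the survey literature above discusses.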
This list is automatically generated from the titles and abstracts of the papers on this site.