Survey on Graph Neural Network Acceleration: An Algorithmic Perspective
- URL: http://arxiv.org/abs/2202.04822v1
- Date: Thu, 10 Feb 2022 04:01:40 GMT
- Title: Survey on Graph Neural Network Acceleration: An Algorithmic Perspective
- Authors: Xin Liu, Mingyu Yan, Lei Deng, Guoqi Li, Xiaochun Ye, Dongrui Fan,
Shirui Pan, Yuan Xie
- Abstract summary: Graph neural networks (GNNs) have been a hot spot of recent research and are widely utilized in diverse applications.
We provide a comprehensive survey on acceleration methods for GNNs from an algorithmic perspective.
- Score: 42.88720757069427
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have been a hot spot of recent research and are
widely utilized in diverse applications. However, as datasets grow larger and
models deeper, there is an urgent demand to accelerate GNNs for more efficient
execution. In this paper, we provide a comprehensive survey
on acceleration methods for GNNs from an algorithmic perspective. We first
present a new taxonomy to classify existing acceleration methods into five
categories. Based on the classification, we systematically discuss these
methods and highlight their correlations. Next, we provide comparisons from
aspects of the efficiency and characteristics of these methods. Finally, we
suggest some promising prospects for future research.
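To ground the discussion, the dominant cost that such acceleration methods target is the neighborhood aggregation performed at every GNN layer. The following is a minimal illustrative sketch of one GCN-style layer in dense form; the dense-matrix formulation and all names here are this note's assumptions for clarity, not the survey's own formulation:

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One GCN-style layer: aggregate neighbor features, then transform.

    adj    : (n, n) adjacency matrix with self-loops
    feats  : (n, d_in) node feature matrix
    weight : (d_in, d_out) learnable weight matrix
    """
    deg = adj.sum(axis=1)                                       # node degrees
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    adj_norm = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # D^-1/2 A D^-1/2
    return np.maximum(adj_norm @ feats @ weight, 0.0)           # ReLU activation

# Tiny 3-node example: a path graph 0-1-2, with self-loops added.
adj = np.array([[1, 1, 0],
                [1, 1, 1],
                [0, 1, 1]], dtype=float)
feats = np.eye(3)  # one-hot node features
rng = np.random.default_rng(0)
weight = rng.standard_normal((3, 2))
out = gcn_layer(adj, feats, weight)
print(out.shape)  # (3, 2)
```

Stacking L such layers makes every node's receptive field grow with the L-hop neighborhood, which is why larger graphs and deeper models quickly become expensive, motivating the acceleration methods the survey classifies.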
Related papers
- Faster Inference Time for GNNs using coarsening [1.323700980948722]
Coarsening-based methods reduce the graph to a smaller one, resulting in faster computation.
No previous research has tackled the computation cost during inference.
This paper presents a novel approach to improve the scalability of GNNs through subgraph-based techniques.
arXiv Detail & Related papers (2024-10-19T06:27:24Z)
- Acceleration Algorithms in GNNs: A Survey [34.28669696478494]
Graph Neural Networks (GNNs) have demonstrated effectiveness in various graph-based tasks.
Their inefficiency in training and inference presents challenges for scaling up to real-world and large-scale graph applications.
A range of algorithms have been proposed to accelerate training and inference of GNNs.
arXiv Detail & Related papers (2024-05-07T08:34:33Z)
- A Survey on Graph Neural Network Acceleration: Algorithms, Systems, and Customized Hardware [30.525912505620685]
Graph neural networks (GNNs) are emerging for machine learning research on graph-structured data.
GNNs achieve state-of-the-art performance on many tasks, but they face scalability challenges when it comes to real-world applications.
We provide a taxonomy of GNN acceleration, review the existing approaches, and suggest future research directions.
arXiv Detail & Related papers (2023-06-24T20:20:45Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- An Empirical Study of Retrieval-enhanced Graph Neural Networks [48.99347386689936]
Graph Neural Networks (GNNs) are effective tools for graph representation learning.
We propose a retrieval-enhanced scheme called GRAPHRETRIEVAL, which is agnostic to the choice of graph neural network models.
We conduct comprehensive experiments over 13 datasets and observe that GRAPHRETRIEVAL achieves substantial improvements over existing GNNs.
arXiv Detail & Related papers (2022-06-01T09:59:09Z)
- Automatic Relation-aware Graph Network Proliferation [182.30735195376792]
We propose Automatic Relation-aware Graph Network Proliferation (ARGNP) for efficiently searching GNNs.
These operations can extract hierarchical node/relational information and provide anisotropic guidance for message passing on a graph.
Experiments on six datasets for four graph learning tasks demonstrate that GNNs produced by our method are superior to the current state-of-the-art hand-crafted and search-based GNNs.
arXiv Detail & Related papers (2022-05-31T10:38:04Z)
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- Tackling Oversmoothing of GNNs with Contrastive Learning [35.88575306925201]
Graph neural networks (GNNs) combine the relational structure of graph data with representation learning capability.
Oversmoothing makes the final node representations indistinguishable, thus deteriorating node classification and link prediction performance.
We propose the Topology-guided Graph Contrastive Layer, named TGCL, which is the first de-oversmoothing method maintaining all three mentioned metrics.
arXiv Detail & Related papers (2021-10-26T15:56:16Z)
- Ranking Structured Objects with Graph Neural Networks [0.0]
RankGNNs are trained with a set of pair-wise preferences between graphs, each indicating that one graph is preferred over the other.
One practical application of this problem is drug screening, where an expert wants to find the most promising molecules in a large collection of drug candidates.
We empirically demonstrate that our proposed pair-wise RankGNN approach either significantly outperforms or at least matches the ranking performance of the naive point-wise baseline approach.
arXiv Detail & Related papers (2021-04-18T14:40:59Z)
- Node Masking: Making Graph Neural Networks Generalize and Scale Better [71.51292866945471]
Graph Neural Networks (GNNs) have received a lot of interest in recent times.
In this paper, we utilize some theoretical tools to better visualize the operations performed by state-of-the-art spatial GNNs.
We introduce a simple concept, Node Masking, that allows them to generalize and scale better.
arXiv Detail & Related papers (2020-01-17T06:26:40Z)
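Sampling is a canonical family of GNN acceleration algorithms covered in surveys of this kind: each layer aggregates over a small random subset of neighbors instead of the full neighborhood, bounding per-node cost on large graphs. The sketch below is illustrative only; the function names and fixed-fanout scheme are this note's assumptions, not taken from any paper listed above:

```python
import random

def sample_neighbors(adj_list, node, fanout, seed=None):
    """Return at most `fanout` neighbors of `node`, sampled without replacement.

    adj_list : dict mapping node id -> list of neighbor ids
    fanout   : maximum number of neighbors to keep per node
    """
    rng = random.Random(seed)
    neighbors = adj_list[node]
    if len(neighbors) <= fanout:
        return list(neighbors)
    return rng.sample(neighbors, fanout)

# A star graph: node 0 is connected to nodes 1..9.
adj_list = {0: list(range(1, 10))}
for v in range(1, 10):
    adj_list[v] = [0]

sampled = sample_neighbors(adj_list, 0, fanout=3, seed=42)
print(len(sampled))  # 3
```

With a fixed fanout f and L layers, the sampled computation graph per node has at most f^L leaves regardless of the true degrees, which is the source of the speedup these methods trade against aggregation accuracy.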
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences of its use.