H-GCN: A Graph Convolutional Network Accelerator on Versal ACAP
Architecture
- URL: http://arxiv.org/abs/2206.13734v1
- Date: Tue, 28 Jun 2022 03:37:31 GMT
- Title: H-GCN: A Graph Convolutional Network Accelerator on Versal ACAP
Architecture
- Authors: Chengming Zhang, Tong Geng, Anqi Guo, Jiannan Tian, Martin Herbordt,
Ang Li, Dingwen Tao
- Abstract summary: H-GCN partitions each graph into three subgraphs based on its inherent heterogeneity, and processes them using PL and AIE, respectively.
Compared with state-of-the-art GNN accelerators, H-GCN achieves, on average, speedups of 1.1~2.3X.
- Score: 13.149863422504332
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs) have drawn tremendous attention due to their
unique capability to extend Machine Learning (ML) approaches to applications
broadly defined as having unstructured data, especially graphs. Compared with
other ML modalities, the acceleration of GNNs is more challenging due to the
irregularity and heterogeneity derived from graph topologies. Existing efforts,
however, have focused mainly
on handling graphs' irregularity and have not studied their heterogeneity.
To this end, we propose H-GCN, a PL (Programmable Logic) and AIE (AI Engine)
based hybrid accelerator that leverages the emerging heterogeneity of Xilinx
Versal Adaptive Compute Acceleration Platforms (ACAPs) to achieve
high-performance GNN inference. In particular, H-GCN partitions each graph into
three subgraphs based on its inherent heterogeneity, and processes them using
PL and AIE, respectively. To further improve performance, we explore the
sparsity support of AIE and develop an efficient density-aware method to
automatically map tiles of sparse matrix-matrix multiplication (SpMM) onto the
systolic tensor array. Compared with state-of-the-art GCN accelerators, H-GCN
achieves, on average, speedups of 1.1~2.3X.
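The density-aware mapping described in the abstract can be sketched in a few lines of Python: tiles of the sparse adjacency matrix are classified by nonzero density and routed to the AIE systolic tensor array, to the PL, or skipped. The tile size, thresholds, and three-way routing below are illustrative assumptions, not the paper's actual policy.

```python
import numpy as np

def partition_tiles_by_density(A, tile=4, dense_thr=0.5, sparse_thr=0.1):
    """Classify square tiles of a sparse matrix by nonzero density.

    Hypothetical routing: dense tiles go to the AIE systolic tensor
    array, moderately sparse tiles to the PL, and near-empty tiles are
    skipped. Thresholds and tile size are illustrative only.
    """
    n = A.shape[0]
    routing = {"aie": [], "pl": [], "skip": []}
    for i in range(0, n, tile):
        for j in range(0, n, tile):
            block = A[i:i + tile, j:j + tile]
            density = np.count_nonzero(block) / block.size
            if density >= dense_thr:
                routing["aie"].append((i, j))
            elif density >= sparse_thr:
                routing["pl"].append((i, j))
            else:
                routing["skip"].append((i, j))
    return routing

# A toy adjacency matrix: one dense community plus a few stray edges.
A = np.zeros((8, 8))
A[:4, :4] = 1
A[4, 5] = A[5, 7] = 1
routing = partition_tiles_by_density(A)
```

Real graphs are first reordered so that dense communities form contiguous tiles; this sketch assumes that reordering has already happened.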
Related papers
- SiHGNN: Leveraging Properties of Semantic Graphs for Efficient HGNN Acceleration [9.85638913900595]
Heterogeneous Graph Neural Networks (HGNNs) have expanded graph representation learning to heterogeneous graph fields.
Recent studies have demonstrated their superior performance across various applications, including medical analysis and recommendation systems.
We propose a lightweight hardware accelerator for HGNNs, called SiHGNN. This accelerator incorporates a tree-based Semantic Graph Builder for efficient semantic graph generation and features a novel Graph Restructurer for optimizing semantic graph layouts.
arXiv Detail & Related papers (2024-08-27T14:20:21Z)
- Efficient Topology-aware Data Augmentation for High-Degree Graph Neural Networks [2.7523980737007414]
We propose TADA, an efficient and effective front-mounted data augmentation framework for graph neural networks (GNNs) on high-degree graphs (HDGs)
Under the hood, TADA includes two key modules: (i) feature expansion with structure embeddings, and (ii) topology- and attribute-aware graph sparsification.
TADA considerably improves the predictive performance of mainstream GNN models on 8 real homophilic/heterophilic HDGs in terms of node classification.
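The sparsification half of the framework can be illustrated with a minimal top-k edge filter; the per-edge `scores` matrix (standing in for a combined topology/attribute importance) and the fixed budget `k` are assumptions for illustration, not TADA's actual method.

```python
import numpy as np

def sparsify_topk(adj, scores, k=2):
    """Keep only each node's k highest-scoring edges.

    A rough stand-in for topology- and attribute-aware sparsification:
    `scores` is a hypothetical per-edge importance matrix (e.g. an
    attribute-similarity score); the budget k is illustrative.
    """
    out = np.zeros_like(adj)
    for u in range(adj.shape[0]):
        nbrs = np.flatnonzero(adj[u])
        if len(nbrs) <= k:
            out[u, nbrs] = 1
            continue
        top = nbrs[np.argsort(scores[u, nbrs])[-k:]]
        out[u, top] = 1
    return out

# Node 0 has three neighbors; only its two highest-scoring edges survive.
adj = np.array([[0, 1, 1, 1],
                [1, 0, 0, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 0]], dtype=float)
scores = np.array([[0.0, 0.9, 0.1, 0.5],
                   [0.7, 0.0, 0.0, 0.0],
                   [0.2, 0.0, 0.0, 0.0],
                   [0.4, 0.0, 0.0, 0.0]])
sparse_adj = sparsify_topk(adj, scores, k=2)
```

For high-degree graphs this bounds the per-node neighborhood size, which is the point of the front-mounted augmentation.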
arXiv Detail & Related papers (2024-06-08T14:14:19Z)
- HGAttack: Transferable Heterogeneous Graph Adversarial Attack [63.35560741500611]
Heterogeneous Graph Neural Networks (HGNNs) are increasingly recognized for their performance in areas like the web and e-commerce.
This paper introduces HGAttack, the first dedicated gray box evasion attack method for heterogeneous graphs.
arXiv Detail & Related papers (2024-01-18T12:47:13Z)
- Efficient Heterogeneous Graph Learning via Random Projection [58.4138636866903]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN)
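The pre-computation idea — one-time message passing compressed into regular-shaped tensors — can be sketched as follows; the sum aggregation and Gaussian random projection are a simplified stand-in for RpHGNN's actual operators.

```python
import numpy as np

def precompute_embeddings(adj, feats, out_dim=4, seed=0):
    """One-time neighbor aggregation squashed by a random projection.

    A simplified stand-in for pre-computation-based HGNNs: message
    passing happens once, offline, and the aggregate is projected to a
    fixed width so every node yields a regular-shaped tensor. The sum
    aggregation and Gaussian projection are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    agg = adj @ feats                                  # one-time message passing
    proj = rng.standard_normal((feats.shape[1], out_dim)) / np.sqrt(out_dim)
    return agg @ proj                                  # (n, out_dim) precomputed input

adj = np.ones((3, 3))          # toy fully-connected graph
feats = np.eye(3)              # one-hot node features
emb = precompute_embeddings(adj, feats, out_dim=4)
```

Because the graph is only touched once, the downstream model can train on the fixed-width `emb` tensor without repeated neighborhood gathering.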
arXiv Detail & Related papers (2023-10-23T01:25:44Z)
- Graph Contrastive Learning with Generative Adversarial Network [35.564028359355596]
Graph generative adversarial networks (GANs) learn the distribution of views for Graph Contrastive Learning (GCL)
We present GACN, a novel Generative Adversarial Contrastive learning Network for graph representation learning.
We show that GACN is able to generate high-quality augmented views for GCL and is superior to twelve state-of-the-art baseline methods.
arXiv Detail & Related papers (2023-08-01T13:28:24Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework to make the homogeneous GNNs have adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
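The one-scalar-per-relation idea can be sketched directly; here the relation weights are fixed by hand rather than learned, and the `(src, dst, rel)` edge-list format is an assumption for illustration.

```python
import numpy as np

def relation_weighted_aggregate(edges, feats, rel_weight, self_weight=1.0):
    """Message passing where every edge type contributes one scalar.

    Captures the RE-GNN idea of one parameter per relation (plus a
    self-loop weight); the weights here are fixed by hand instead of
    learned, and the (src, dst, rel) edge list is an illustrative format.
    """
    out = self_weight * feats.copy()            # weighted self-loop term
    for src, dst, rel in edges:
        out[dst] += rel_weight[rel] * feats[src]
    return out

feats = np.eye(3)                               # one-hot features for 3 nodes
edges = [(0, 2, "cites"), (1, 2, "writes")]     # two typed edges into node 2
h = relation_weighted_aggregate(edges, feats, {"cites": 0.5, "writes": 2.0})
```

With one scalar per relation, a homogeneous GNN gains type-awareness at negligible parameter cost.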
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN)
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- GROW: A Row-Stationary Sparse-Dense GEMM Accelerator for Memory-Efficient Graph Convolutional Neural Networks [4.669338722185048]
A unique property of graph convolutional neural networks (GCNs) is that their two primary execution stages, aggregation and combination, exhibit drastically different dataflows.
We present GROW, a GCN accelerator based on Gustavson's algorithm to architect a row-wise product based sparse-dense GEMM accelerator.
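Gustavson's row-wise dataflow that GROW architects in hardware can be modeled in plain Python over a CSR matrix: each sparse row of A is streamed once, and the matching rows of the dense B are scaled and accumulated into a single stationary output row (this is a software model of the dataflow, not the accelerator itself).

```python
import numpy as np

def rowwise_spmm(indptr, indices, data, B):
    """Sparse-dense GEMM with Gustavson's row-wise dataflow.

    One sparse row of A is streamed at a time; each nonzero scales the
    matching dense row of B, accumulating into a single stationary
    output row (small working set, no large partial-sum buffers).
    """
    n = len(indptr) - 1
    C = np.zeros((n, B.shape[1]))
    for i in range(n):                          # output row i stays resident
        for p in range(indptr[i], indptr[i + 1]):
            C[i] += data[p] * B[indices[p]]     # axpy against a dense row
    return C

# CSR encoding of A = [[1, 0, 2], [0, 3, 0]]
indptr, indices, data = [0, 2, 3], [0, 2, 1], [1.0, 2.0, 3.0]
B = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
C = rowwise_spmm(indptr, indices, data, B)
```

The row-stationary order means each output row is produced exactly once and never revisited, which is what makes the scheme memory-efficient.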
arXiv Detail & Related papers (2022-03-01T00:26:31Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate a significant and consistent performance improvement in model quality with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Graph Highway Networks [77.38665506495553]
Graph Convolution Networks (GCNs) are widely used in learning graph representations due to their effectiveness and efficiency.
However, they suffer from the notorious over-smoothing problem, in which the learned representations converge to similar vectors when many layers are stacked.
We propose Graph Highway Networks (GHNet) which utilize gating units to balance the trade-off between homogeneity and heterogeneity in the GCN learning process.
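A highway-style gate on top of a GCN layer can be sketched as below; the tanh transform, sigmoid gate, and parameter names are illustrative assumptions rather than GHNet's exact formulation.

```python
import numpy as np

def highway_gcn_layer(A_hat, H, W, Wg, bg):
    """One GCN layer mixed with its input through a highway gate.

    The sigmoid gate g interpolates per unit between the smoothed GCN
    path and the untouched input, which limits over-smoothing when many
    layers stack. Names and the tanh/sigmoid choices are illustrative,
    not GHNet's exact formulation.
    """
    T = np.tanh(A_hat @ H @ W)                  # transformed (GCN) path
    g = 1.0 / (1.0 + np.exp(-(H @ Wg + bg)))    # gate in (0, 1)
    return g * T + (1.0 - g) * H                # carry original features

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 3))
A_hat = np.eye(4)                               # trivial normalized adjacency
W = rng.standard_normal((3, 3))
# A strongly negative gate bias drives g -> 0, so the layer passes H through.
out = highway_gcn_layer(A_hat, H, W, np.zeros((3, 3)), np.full(3, -50.0))
```

When the gate saturates closed, features bypass the smoothing path entirely, which is the mechanism that balances homogeneity against heterogeneity across depth.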
arXiv Detail & Related papers (2020-04-09T16:26:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.