BG-HGNN: Toward Scalable and Efficient Heterogeneous Graph Neural
Network
- URL: http://arxiv.org/abs/2403.08207v1
- Date: Wed, 13 Mar 2024 03:03:40 GMT
- Title: BG-HGNN: Toward Scalable and Efficient Heterogeneous Graph Neural
Network
- Authors: Junwei Su, Lingjun Mao, Chuan Wu
- Abstract summary: Heterogeneous graph neural networks (HGNNs) stand out as a promising neural model class designed for heterogeneous graphs.
Existing HGNNs employ different parameter spaces to model the varied relationships.
This paper introduces Blend&Grind-HGNN, which integrates different relations into a unified feature space manageable by a single set of parameters.
- Score: 6.598758004828656
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many computer vision and machine learning problems are modelled as learning
tasks on heterogeneous graphs, featuring a wide array of relations from diverse
types of nodes and edges. Heterogeneous graph neural networks (HGNNs) stand out
as a promising neural model class designed for heterogeneous graphs. Built on
traditional GNNs, existing HGNNs employ different parameter spaces to model the
varied relationships. However, the practical effectiveness of existing HGNNs is
often limited to simple heterogeneous graphs with few relation types. This
paper first highlights and demonstrates that the standard approach employed by
existing HGNNs inevitably leads to parameter explosion and relation collapse,
making HGNNs less effective or impractical for complex heterogeneous graphs
with numerous relation types. To overcome this issue, we introduce a novel
framework, Blend&Grind-HGNN (BG-HGNN), which effectively tackles the challenges
by carefully integrating different relations into a unified feature space
manageable by a single set of parameters. This results in a refined HGNN method
that is more efficient and effective in learning from heterogeneous graphs,
especially when the number of relations grows. Our empirical studies illustrate
that BG-HGNN significantly surpasses existing HGNNs in terms of parameter
efficiency (up to 28.96 $\times$), training throughput (up to 8.12 $\times$),
and accuracy (up to 1.07 $\times$).
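To make the parameter-explosion argument concrete, here is a minimal PyTorch sketch (an illustration of the problem and of the unified-space idea, not the authors' implementation; class names and dimensions are assumptions): a conventional per-relation layer allocates one weight matrix per relation type, while a unified layer blends a small relation embedding into each message so a single shared parameter set covers all relations.

```python
import torch
import torch.nn as nn

class PerRelationLayer(nn.Module):
    """RGCN-style layer: one weight matrix per relation type.
    Parameter count grows linearly with the number of relations."""
    def __init__(self, num_relations: int, dim: int):
        super().__init__()
        self.weights = nn.Parameter(torch.randn(num_relations, dim, dim) * 0.01)

    def forward(self, x, edge_index, edge_type):
        src, dst = edge_index                       # (E,), (E,)
        # per-edge matrix-vector product with the relation-specific weight
        msg = torch.einsum('ed,edk->ek', x[src], self.weights[edge_type])
        out = torch.zeros_like(x)
        return out.index_add_(0, dst, msg)          # sum messages per target node

class UnifiedRelationLayer(nn.Module):
    """BG-HGNN-style idea (sketch): blend a learned relation embedding into
    each message so ONE shared weight matrix serves all relation types."""
    def __init__(self, num_relations: int, dim: int, rel_dim: int = 8):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, rel_dim)
        self.weight = nn.Linear(dim + rel_dim, dim)   # single parameter set

    def forward(self, x, edge_index, edge_type):
        src, dst = edge_index
        msg = self.weight(torch.cat([x[src], self.rel_emb(edge_type)], dim=-1))
        out = torch.zeros_like(x)
        return out.index_add_(0, dst, msg)
```

With $R$ relations and hidden size $d$, the first layer stores $R \cdot d^2$ weights while the second stores roughly $(d + \text{rel\_dim}) \cdot d$, independent of $R$; this is the kind of scaling gap the paper quantifies.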
Related papers
- First-order PDES for Graph Neural Networks: Advection And Burgers Equation Models [1.4174475093445238]
This paper presents new Graph Neural Network models that incorporate two first-order Partial Differential Equations (PDEs).
Our experimental findings highlight the capacity of our new PDE model to achieve results comparable to higher-order PDE models and to mitigate the over-smoothing problem at depths of up to 64 layers.
Results underscore the adaptability and versatility of GNNs, indicating that unconventional approaches can yield outcomes on par with established techniques.
arXiv Detail & Related papers (2024-04-03T21:47:02Z)
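As a flavor of the advection idea in the entry above, here is a hedged sketch of one explicit-Euler advection step on a graph (an illustration of the general technique, not the paper's exact formulation; the per-edge velocities would typically be learned):

```python
import torch

def graph_advection_step(x, edge_index, velocity, dt=0.1):
    """One explicit-Euler step of a graph advection equation: feature mass
    flows along directed edges at per-edge velocities, so total mass is
    conserved (illustrative sketch; names and scales are assumptions).
    x: (N, d) node features; edge_index: (2, E); velocity: (E,) in [0, 1]."""
    src, dst = edge_index
    flow = velocity.unsqueeze(-1) * x[src]   # mass leaving each source node
    dx = torch.zeros_like(x)
    dx.index_add_(0, dst, flow)              # inflow at target nodes
    dx.index_add_(0, src, -flow)             # matching outflow at sources
    return x + dt * dx
```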
- GCNH: A Simple Method For Representation Learning On Heterophilous Graphs [4.051099980410583]
Graph Neural Networks (GNNs) are well-suited for learning on homophilous graphs.
Recent works have proposed extensions to standard GNN architectures to improve performance on heterophilous graphs.
We propose GCN for Heterophily (GCNH), a simple yet effective GNN architecture applicable to both heterophilous and homophilous scenarios.
arXiv Detail & Related papers (2023-04-21T11:26:24Z)
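One reading of the GCNH entry above, sketched in code (an interpretation of the abstract, not the authors' implementation): embed the center node and its neighborhood separately, then mix them with a learned per-layer coefficient so heterophilous neighbors need not dominate the representation.

```python
import torch
import torch.nn as nn

class GCNHLayer(nn.Module):
    """Hedged sketch of the GCNH idea: separate ego and neighborhood
    embeddings combined by one learned mixing coefficient per layer."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.ego = nn.Linear(in_dim, out_dim)
        self.nbr = nn.Linear(in_dim, out_dim)
        self.beta = nn.Parameter(torch.tensor(0.5))  # per-layer mixing weight

    def forward(self, x, adj_norm):
        # adj_norm: (N, N) row-normalized adjacency (dense here for brevity)
        h_nbr = self.nbr(adj_norm @ x)
        h_ego = self.ego(x)
        b = torch.sigmoid(self.beta)                  # keep the mix in (0, 1)
        return torch.relu(b * h_ego + (1.0 - b) * h_nbr)
```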
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
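The one-parameter-per-relation idea in the RE-GNN entry above is easy to sketch (an illustrative reading of the abstract, not the authors' code): an ordinary homogeneous layer whose messages are scaled by a single learnable scalar per relation type, plus one scalar for self-loops.

```python
import torch
import torch.nn as nn

class REGNNLayer(nn.Module):
    """Sketch of the RE-GNN idea: a plain GCN-style layer whose messages are
    scaled by ONE learnable scalar per relation type (plus one for self-loops),
    so handling heterogeneity costs only O(#relations) extra parameters."""
    def __init__(self, num_relations: int, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)        # shared across relations
        self.rel_score = nn.Parameter(torch.ones(num_relations))
        self.self_score = nn.Parameter(torch.ones(1))

    def forward(self, x, edge_index, edge_type):
        src, dst = edge_index
        msg = self.rel_score[edge_type].unsqueeze(-1) * x[src]
        agg = torch.zeros_like(x).index_add_(0, dst, msg)
        return torch.relu(self.lin(agg + self.self_score * x))
```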
- EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter.
arXiv Detail & Related papers (2022-05-27T10:48:14Z)
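An even-polynomial graph filter of the kind named in the EvenNet entry above can be sketched directly (illustrative; the real model learns the coefficients): only even powers of the propagation matrix are applied, so information from odd-hop neighbors is ignored, i.e. $y = \sum_k w_k P^{2k} x$.

```python
import torch

def even_polynomial_filter(x, adj_norm, weights):
    """Sketch of an even-polynomial spectral filter in the spirit of EvenNet.
    adj_norm: (N, N) symmetrically normalized adjacency (propagation matrix P);
    weights: list of K+1 scalar coefficients w_0 .. w_K."""
    p2 = adj_norm @ adj_norm          # P^2: two-hop propagation
    out = weights[0] * x
    h = x
    for w in weights[1:]:
        h = p2 @ h                    # advance by two hops each step
        out = out + w * h
    return out
```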
- Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
- MGNN: Graph Neural Networks Inspired by Distance Geometry Problem [28.789684784093048]
Graph Neural Networks (GNNs) have emerged as a prominent research topic in the field of machine learning.
In this paper, we propose a GNN model inspired by the congruent-insensitivity property of the classifiers in the classification phase of GNNs.
We extensively evaluate the effectiveness of our model through experiments conducted on both synthetic and real-world datasets.
arXiv Detail & Related papers (2022-01-31T04:15:42Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
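The unifying observation in the entry above can be stated compactly (notation adapted here; see the paper for the exact correspondence). For input features $X$, graph Laplacian $L$, and smoothness weight $c$, the aggregation in representative GNNs amounts to solving, or taking gradient steps on, the graph signal denoising objective

$$\min_{F}\ \|F - X\|_F^2 + c\,\mathrm{tr}\!\left(F^\top L F\right),$$

where the first term keeps the cleaned signal $F$ close to the input and the second penalizes differences across edges; one gradient step from $F = X$ recovers a GCN-style propagation.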
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Heterogeneous Graph Transformer [49.675064816860505]
We present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs.
To handle dynamic heterogeneous graphs, we introduce the relative temporal encoding technique into HGT.
To handle Web-scale graph data, we design the heterogeneous mini-batch graph sampling algorithm---HGSampling---for efficient and scalable training.
arXiv Detail & Related papers (2020-03-03T04:49:21Z)
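As a flavor of the relative temporal encoding mentioned in the HGT entry above, here is a hedged sketch: a sinusoidal embedding of the time gap between two interacting nodes that can be added to the source node's features before attention. HGT additionally learns a projection on top of this; the function below is an illustrative simplification with assumed shapes.

```python
import math
import torch

def relative_temporal_encoding(delta_t, dim):
    """Sketch of a relative temporal encoding: sinusoidal embedding of the
    time difference between two interacting nodes (illustrative; `dim` is
    assumed even, and scales follow the standard Transformer recipe).
    delta_t: (E,) per-edge time differences; returns (E, dim)."""
    position = delta_t.float().unsqueeze(-1)                    # (E, 1)
    freq = torch.exp(torch.arange(0, dim, 2).float()
                     * (-math.log(10000.0) / dim))              # (dim/2,)
    enc = torch.zeros(delta_t.size(0), dim)
    enc[:, 0::2] = torch.sin(position * freq)
    enc[:, 1::2] = torch.cos(position * freq)
    return enc
```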
- Random Features Strengthen Graph Neural Networks [40.60905158071766]
Graph neural networks (GNNs) are powerful machine learning models for various graph learning tasks.
In this paper, we demonstrate that GNNs become powerful just by adding a random feature to each node.
We show that the addition of random features enables GNNs to solve various problems that normal GNNs, including graph convolutional networks (GCNs) and graph isomorphism networks (GINs), cannot solve.
arXiv Detail & Related papers (2020-02-08T12:47:29Z)
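The recipe in the entry above is literally one line of tensor code. A minimal sketch (the function name is ours; the paper's analysis concerns what message passing can compute once nodes carry such identifiers):

```python
import torch

def add_random_feature(x, seed=None):
    """Concatenate one random scalar feature to each node, as the paper
    proposes. The random values act as approximate node identifiers, letting
    message passing distinguish structures that plain GCNs/GINs cannot
    (e.g. nodes with identical local neighborhoods)."""
    if seed is not None:
        torch.manual_seed(seed)
    r = torch.rand(x.size(0), 1)          # one fresh random value per node
    return torch.cat([x, r], dim=-1)      # (N, d) -> (N, d + 1)
```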