MECCH: Metapath Context Convolution-based Heterogeneous Graph Neural
Networks
- URL: http://arxiv.org/abs/2211.12792v2
- Date: Thu, 23 Nov 2023 16:13:10 GMT
- Title: MECCH: Metapath Context Convolution-based Heterogeneous Graph Neural
Networks
- Authors: Xinyu Fu, Irwin King
- Abstract summary: Heterogeneous graph neural networks (HGNNs) were proposed for representation learning on structured data with multiple types of nodes and edges.
We present a novel Metapath Context Convolution-based Heterogeneous Graph Neural Network (MECCH).
- Score: 45.68142605304948
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Heterogeneous graph neural networks (HGNNs) were proposed for representation
learning on structured data with multiple types of nodes and edges. To deal
with the performance degradation issue when HGNNs become deep, researchers
combine metapaths into HGNNs to associate nodes closely related in semantics
but far apart in the graph. However, existing metapath-based models suffer from
either information loss or high computation costs. To address these problems,
we present a novel Metapath Context Convolution-based Heterogeneous Graph
Neural Network (MECCH). MECCH leverages metapath contexts, a new kind of graph
structure that facilitates lossless node information aggregation while avoiding
any redundancy. Specifically, MECCH applies three novel components after
feature preprocessing to extract comprehensive information from the input graph
efficiently: (1) metapath context construction, (2) metapath context encoder,
and (3) convolutional metapath fusion. Experiments on five real-world
heterogeneous graph datasets for node classification and link prediction show
that MECCH achieves superior prediction accuracy compared with state-of-the-art
baselines with improved computational efficiency.
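The abstract names the three MECCH components but gives no implementation details. The minimal PyTorch sketch below illustrates one plausible reading of that pipeline; the class name MECCHSketch, the mean-pooling context encoder, and the Conv1d-based fusion are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only; not the authors' code. It mirrors the three
# components named in the abstract: (1) metapath context construction,
# (2) metapath context encoding, and (3) convolutional metapath fusion.
import torch
import torch.nn as nn

class MECCHSketch(nn.Module):
    def __init__(self, in_dim, hid_dim, num_metapaths):
        super().__init__()
        self.preprocess = nn.Linear(in_dim, hid_dim)   # feature preprocessing
        # (2) one encoder per metapath, applied to pooled metapath-context features
        self.context_encoders = nn.ModuleList(
            [nn.Linear(hid_dim, hid_dim) for _ in range(num_metapaths)]
        )
        # (3) 1x1 convolution that fuses the per-metapath channels
        self.fusion = nn.Conv1d(num_metapaths, 1, kernel_size=1)

    def forward(self, feats, metapath_contexts):
        # feats: [N, in_dim] node features
        # metapath_contexts[p][v]: index tensor listing every node in the metapath
        # context of target node v under metapath p; assumed precomputed, i.e.
        # component (1) has already been run as a preprocessing step
        h = self.preprocess(feats)
        channels = []
        for p, contexts in enumerate(metapath_contexts):
            # each context node contributes exactly once via mean pooling
            pooled = torch.stack([h[idx].mean(dim=0) for idx in contexts])
            channels.append(torch.relu(self.context_encoders[p](pooled)))
        stacked = torch.stack(channels, dim=1)          # [N, P, hid_dim]
        return self.fusion(stacked).squeeze(1)          # [N, hid_dim]
```

A toy call such as MECCHSketch(8, 16, 2)(torch.randn(4, 8), contexts), where contexts holds two lists of four index tensors each, produces a [4, 16] output.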
Related papers
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from Heterogeneous Graph Benchmark (HGB) and Open Graph Benchmark (OGB)
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Heterogeneous Graph Tree Networks [8.50892442127182]
Heterogeneous graph neural networks (HGNNs) have attracted increasing research interest over the past three years.
One class is meta-path-based HGNNs, which either require domain knowledge to handcraft meta-paths or consume large amounts of time and memory to construct meta-paths automatically.
We propose two models: Heterogeneous Graph Tree Convolutional Network (HetGTCN) and Heterogeneous Graph Tree Attention Network (HetGTAN)
arXiv Detail & Related papers (2022-09-01T17:22:01Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have a powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN)
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
The heterogeneous graph neural network (HGNN) is a popular technique for modeling and analyzing heterogeneous graphs.
We develop, for the first time, a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views guided by node attributes and graph topologies, respectively.
In this approach, we adopt distinct attribute and topology fusion mechanisms suited to each view, which helps mine relevant information from attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
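The HGCL entry above describes two contrastive views guided by node attributes and graph topology. As a hedged illustration of that general idea, the sketch below computes a generic InfoNCE-style loss between attribute-view and topology-view embeddings; the function name and temperature default are assumptions, and this is not the paper's actual training objective.

```python
# Generic two-view contrastive objective in the spirit of the HGCL entry above
# (attribute-guided view vs. topology-guided view); an illustrative sketch,
# not the paper's actual loss.
import torch
import torch.nn.functional as F

def two_view_contrastive_loss(z_attr, z_topo, temperature=0.5):
    # z_attr: [N, d] node embeddings from the attribute-guided view
    # z_topo: [N, d] node embeddings from the topology-guided view
    z_attr = F.normalize(z_attr, dim=1)
    z_topo = F.normalize(z_topo, dim=1)
    logits = z_attr @ z_topo.t() / temperature   # [N, N] cross-view similarities
    labels = torch.arange(z_attr.size(0))        # node i matches itself across views
    # symmetric InfoNCE: each node is its own positive, all other nodes are negatives
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))
```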
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the limitations of existing methods.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
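The SHGNN entry above mentions propagating features of intermediate nodes on a meta-path and aggregating them with tree attention. The sketch below shows a simple attention-weighted aggregation over the nodes of a single meta-path instance; the module name and scoring function are assumptions for illustration, not SHGNN's actual aggregator.

```python
# Hedged illustration of attention-weighted aggregation over the nodes of one
# meta-path instance (in the spirit of the SHGNN entry above); not the paper's
# implementation.
import torch
import torch.nn as nn

class MetapathInstanceAttention(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # scores each instance node against the target node it is aggregated into
        self.score = nn.Linear(2 * dim, 1)

    def forward(self, target, instance_nodes):
        # target: [d] target-node feature; instance_nodes: [L, d] features of all
        # nodes lying on one meta-path instance ending at the target node
        pairs = torch.cat([instance_nodes, target.expand_as(instance_nodes)], dim=-1)
        alpha = torch.softmax(self.score(pairs).squeeze(-1), dim=0)   # [L] attention weights
        return (alpha.unsqueeze(-1) * instance_nodes).sum(dim=0)      # [d] aggregated message
```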
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Metapaths guided Neighbors aggregated Network for Heterogeneous Graph Reasoning [5.228629954007088]
We propose a Metapaths-guided Neighbors-aggregated Heterogeneous Graph Neural Network to improve performance.
We conduct extensive experiments for the proposed MHN on three real-world heterogeneous graph datasets.
arXiv Detail & Related papers (2021-03-11T05:42:06Z)
- Metapath- and Entity-aware Graph Neural Network for Recommendation [10.583077434945187]
In graph neural networks (GNNs), message passing iteratively aggregates nodes' information from their direct neighbors.
Sequential node connections such as metapaths capture critical insights for downstream tasks.
We employ collaborative subgraphs (CSGs) and metapaths to form metapath-aware subgraphs.
PEAGNN trains multilayer GNNs to perform metapath-aware information aggregation on such subgraphs.
arXiv Detail & Related papers (2020-10-22T15:14:30Z)
- MAGNN: Metapath Aggregated Graph Neural Network for Heterogeneous Graph Embedding [36.6390478350677]
We propose a new model named Metapath Aggregated Graph Neural Network (MAGNN) to boost the final performance.
MAGNN employs three major components, i.e., the node content transformation to encapsulate input node attributes, the intra-metapath aggregation to incorporate intermediate semantic nodes, and the inter-metapath aggregation to combine messages from multiple metapaths.
Experiments show that MAGNN achieves more accurate prediction results than state-of-the-art baselines.
arXiv Detail & Related papers (2020-02-05T08:21:00Z)
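The MAGNN entry names node content transformation, intra-metapath aggregation, and inter-metapath aggregation. As a hedged illustration of the last component only, the sketch below fuses per-metapath node embeddings with a semantic-level attention vector; the class name, dimensions, and two-layer scoring network are assumptions, not MAGNN's code.

```python
# Minimal sketch of inter-metapath (semantic-level) attention fusion, the third
# component named in the MAGNN entry above; illustrative only, not the authors' code.
import torch
import torch.nn as nn

class InterMetapathAttention(nn.Module):
    def __init__(self, dim, att_dim=128):
        super().__init__()
        # small scoring network that assigns one importance score per metapath
        self.project = nn.Sequential(
            nn.Linear(dim, att_dim), nn.Tanh(), nn.Linear(att_dim, 1, bias=False)
        )

    def forward(self, per_metapath_h):
        # per_metapath_h: [P, N, d], one node-embedding matrix per metapath
        scores = self.project(per_metapath_h).mean(dim=1)   # [P, 1] metapath importance
        beta = torch.softmax(scores, dim=0).unsqueeze(-1)   # [P, 1, 1] fusion weights
        return (beta * per_metapath_h).sum(dim=0)           # [N, d] fused node embeddings

# Toy usage: 3 metapaths, 5 nodes, 16-dimensional embeddings
fused = InterMetapathAttention(16)(torch.randn(3, 5, 16))   # shape [5, 16]
```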
This list is automatically generated from the titles and abstracts of the papers on this site.