Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph
- URL: http://arxiv.org/abs/2305.10771v2
- Date: Sat, 12 Aug 2023 09:14:40 GMT
- Title: Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph
- Authors: Chenguang Du, Kaichun Yao, Hengshu Zhu, Deqing Wang, Fuzhen Zhuang and
Hui Xiong
- Abstract summary: We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from Heterogeneous Graph Benchmark (HGB) and Open Graph Benchmark (OGB).
- Score: 57.2953563124339
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent years have witnessed the rapid development of heterogeneous graph
neural networks (HGNNs) in information retrieval (IR) applications. Many
existing HGNNs design a variety of tailor-made graph convolutions to capture
structural and semantic information in heterogeneous graphs. However, existing
HGNNs usually represent each node as a single vector in the multi-layer graph
convolution calculation, which makes the high-level graph convolution layer
fail to distinguish information from different relations and different orders,
resulting in information loss during message passing. To this end, we propose
a novel heterogeneous graph neural
network with sequential node representation, namely Seq-HGNN. To avoid the
information loss caused by the single vector node representation, we first
design a sequential node representation learning mechanism to represent each
node as a sequence of meta-path representations during the node message
passing. Then we propose a heterogeneous representation fusion module,
empowering Seq-HGNN to identify important meta-paths and aggregate their
representations into a compact one. We conduct extensive experiments on four
widely used datasets from Heterogeneous Graph Benchmark (HGB) and Open Graph
Benchmark (OGB). Experimental results show that our proposed method outperforms
state-of-the-art baselines in both accuracy and efficiency. The source code is
available at https://github.com/nobrowning/SEQ_HGNN.
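The abstract describes the two key components only at a high level, so the following is a minimal PyTorch-style sketch of the fusion idea rather than the authors' implementation: each node carries one representation per meta-path, and a learned attention weighting collapses that sequence into a single compact vector. The module name, dimensions, and the additive-attention scoring are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MetaPathFusion(nn.Module):
    """Hedged sketch: fuse a sequence of per-meta-path node representations
    into one compact vector via learned attention (illustrative only)."""

    def __init__(self, dim: int):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, 1))

    def forward(self, seq_repr: torch.Tensor) -> torch.Tensor:
        # seq_repr: [num_nodes, num_meta_paths, dim] -- one vector per meta-path
        attn = torch.softmax(self.score(seq_repr), dim=1)  # importance of each meta-path
        return (attn * seq_repr).sum(dim=1)                # [num_nodes, dim]

# toy usage: 5 nodes, 3 meta-paths, 16-dimensional representations
fused = MetaPathFusion(16)(torch.randn(5, 3, 16))
print(fused.shape)  # torch.Size([5, 16])
```

In Seq-HGNN this fusion sits on top of the sequential message passing; the snippet only shows how a sequence of meta-path representations can be reduced to one vector.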
Related papers
- SF-GNN: Self Filter for Message Lossless Propagation in Deep Graph Neural Network [38.669815079957566]
Graph Neural Networks (GNNs), whose core idea is to encode graph structure information through propagation and aggregation, have developed rapidly.
They achieve excellent performance in representation learning on multiple types of graphs, such as homogeneous graphs, heterogeneous graphs, and more complex graphs like knowledge graphs.
We propose a new perspective on the phenomenon of performance degradation in deep GNNs.
arXiv Detail & Related papers (2024-07-03T02:40:39Z) - UniG-Encoder: A Universal Feature Encoder for Graph and Hypergraph Node Classification [6.977634174845066]
A universal feature encoder for both graph and hypergraph representation learning is designed, called UniG-Encoder.
The architecture starts with a forward transformation of the topological relationships of connected nodes into edge or hyperedge features.
The encoded node embeddings are then derived from the reversed transformation, described by the transpose of the projection matrix.
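A minimal sketch of the projection idea just described, under stated assumptions: the (hyper)edge-node incidence matrix P, its row normalization, and the small MLP encoder are illustrative choices, not the paper's exact construction. Node features are projected onto (hyper)edges, encoded, and mapped back to nodes via the transpose of the projection.

```python
import torch
import torch.nn as nn

num_nodes, num_edges, dim = 6, 4, 8
P = torch.zeros(num_edges, num_nodes)
P[0, [0, 1]] = 1.0          # edge 0 connects nodes 0 and 1
P[1, [1, 2, 3]] = 1.0       # a hyperedge joining nodes 1, 2, 3
P[2, [3, 4]] = 1.0
P[3, [4, 5]] = 1.0
P = P / P.sum(dim=1, keepdim=True)      # row-normalize the projection (assumption)

X = torch.randn(num_nodes, dim)         # raw node features
encoder = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

edge_feats = P @ X                      # forward transformation: nodes -> (hyper)edges
node_emb = P.t() @ encoder(edge_feats)  # reversed transformation via the transpose
print(node_emb.shape)                   # torch.Size([6, 8])
```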
arXiv Detail & Related papers (2023-08-03T09:32:50Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
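Since the summary names the two ingredients, here is a heavily simplified, dense sketch of all-pair message passing with Gumbel-Softmax relaxed edge weights. NodeFormer's kernelized operator is precisely what removes the quadratic cost incurred below; the projections, temperature, and shapes are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def all_pair_message_passing(x, wq, wk, wv, tau=0.5):
    # x: [N, d]; wq/wk/wv: linear projections; tau: Gumbel-Softmax temperature
    q, k, v = wq(x), wk(x), wv(x)
    logits = q @ k.t() / q.shape[-1] ** 0.5              # affinities over all node pairs
    weights = F.gumbel_softmax(logits, tau=tau, dim=-1)  # relaxed latent adjacency
    return weights @ v                                   # every node aggregates from every node

d = 16
x = torch.randn(10, d)
out = all_pair_message_passing(x, nn.Linear(d, d), nn.Linear(d, d), nn.Linear(d, d))
print(out.shape)  # torch.Size([10, 16])
```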
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that enables homogeneous GNNs to handle heterogeneous graphs effectively.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
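The one-parameter-per-relation idea lends itself to a small sketch: the layer below scales each relation's messages by a single learnable scalar and adds a weighted self-loop before a shared linear transform. This is an illustrative reading of the summary, not the released RE-GNN code.

```python
import torch
import torch.nn as nn

class REGNNLayer(nn.Module):
    """Hedged sketch: one learnable scalar per edge type plus one for the
    self-loop, used to weight messages before a shared transform."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.rel_weight = nn.Parameter(torch.ones(num_relations))  # one scalar per relation
        self.self_weight = nn.Parameter(torch.ones(1))             # self-loop importance
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, edge_index, edge_type):
        # x: [N, dim]; edge_index: [2, E] (src, dst); edge_type: [E]
        src, dst = edge_index
        msg = x[src] * self.rel_weight[edge_type].unsqueeze(-1)    # scale by relation
        agg = torch.zeros_like(x).index_add_(0, dst, msg)          # sum messages per node
        return self.lin(self.self_weight * x + agg)

layer = REGNNLayer(dim=8, num_relations=3)
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])
edge_type = torch.tensor([0, 2, 1])
print(layer(x, edge_index, edge_type).shape)  # torch.Size([4, 8])
```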
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Meta-Weight Graph Neural Network: Push the Limits Beyond Global Homophily [24.408557217909316]
Graph Neural Networks (GNNs) show strong expressive power in graph data mining.
However, not all graphs are homophilic, and even within the same graph, the distributions may vary significantly.
We propose Meta Weight Graph Neural Network (MWGNN) to adaptively construct graph convolution layers for different nodes.
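A hedged sketch of node-adaptive graph convolution in the spirit of this summary: a small gating network produces per-node mixing weights over a self-transform and a neighborhood aggregation, so different nodes effectively use different convolutions. The gating design is an assumption, not MWGNN's actual meta-weight construction.

```python
import torch
import torch.nn as nn

class NodeAdaptiveConv(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.self_lin = nn.Linear(dim, dim)
        self.neigh_lin = nn.Linear(dim, dim)
        self.gate = nn.Sequential(nn.Linear(dim, 2), nn.Softmax(dim=-1))

    def forward(self, x, adj):
        # x: [N, dim]; adj: [N, N] row-normalized dense adjacency
        neigh = adj @ x                       # neighborhood aggregation
        w = self.gate(x)                      # per-node mixing weights, [N, 2]
        return w[:, :1] * self.self_lin(x) + w[:, 1:] * self.neigh_lin(neigh)

x = torch.randn(5, 8)
adj = torch.full((5, 5), 1.0 / 5)             # toy uniform adjacency
print(NodeAdaptiveConv(8)(x, adj).shape)      # torch.Size([5, 8])
```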
arXiv Detail & Related papers (2022-03-19T09:27:38Z) - Incorporating Heterophily into Graph Neural Networks for Graph Classification [6.709862924279403]
Graph Neural Networks (GNNs) often assume strong homophily for graph classification, seldom considering heterophily.
We develop a novel GNN architecture called IHGNN (short for Incorporating Heterophily into Graph Neural Networks).
We empirically validate IHGNN on various graph datasets and demonstrate that it outperforms the state-of-the-art GNNs for graph classification.
arXiv Detail & Related papers (2022-03-15T06:48:35Z) - SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
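As a rough companion to the three steps above, the sketch below covers only the aggregation side: attention over the instances of each meta-path, then attention over meta-paths, with the feature propagation over intermediate nodes assumed to have already produced the instance vectors. Names, shapes, and the additive scoring are assumptions.

```python
import torch
import torch.nn as nn

class TwoLevelAggregator(nn.Module):
    """Hedged sketch: aggregate meta-path instances per meta-path, then fuse
    the per-meta-path results for one target node (illustrative only)."""

    def __init__(self, dim):
        super().__init__()
        self.instance_score = nn.Linear(dim, 1)   # scores meta-path instances
        self.path_score = nn.Linear(dim, 1)       # scores meta-paths

    def forward(self, inst):
        # inst: [num_meta_paths, num_instances, dim]; instance vectors assumed
        # to already summarize the intermediate nodes on each path instance
        a = torch.softmax(self.instance_score(inst), dim=1)
        per_path = (a * inst).sum(dim=1)          # [num_meta_paths, dim]
        b = torch.softmax(self.path_score(per_path), dim=0)
        return (b * per_path).sum(dim=0)          # fused representation, [dim]

out = TwoLevelAggregator(16)(torch.randn(3, 7, 16))
print(out.shape)  # torch.Size([16])
```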
arXiv Detail & Related papers (2021-12-12T14:18:18Z) - Schema-Aware Deep Graph Convolutional Networks for Heterogeneous Graphs [10.526065883783899]
Graph convolutional network (GCN) based approaches have achieved significant progress for solving complex, graph-structured problems.
We propose a GCN framework, the Deep Heterogeneous Graph Convolutional Network (DHGCN).
It takes advantage of the schema of a heterogeneous graph and uses a hierarchical approach to effectively utilize information many hops away.
arXiv Detail & Related papers (2021-05-03T06:24:27Z) - Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z) - Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs can generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of techniques for graph representation learning.
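The summary is terse, so here is a hedged sketch of the general distance-encoding idea: compute capped shortest-path distances from a target node set and append their one-hot encodings as extra node features. The multi-source BFS, the 3-hop cap, and the use of shortest-path distance as the distance measure are illustrative assumptions.

```python
import torch
import torch.nn.functional as F
from collections import deque

def distance_encoding(adj_list, targets, max_dist=3):
    """Multi-source BFS distances from `targets`, capped at `max_dist`,
    returned as one-hot features (a hedged sketch of the DE idea)."""
    dist = {v: max_dist for v in adj_list}        # far/unreachable nodes keep the cap
    queue = deque((t, 0) for t in targets)
    for t in targets:
        dist[t] = 0
    while queue:
        v, d = queue.popleft()
        for u in adj_list[v]:
            if d + 1 < dist[u]:
                dist[u] = d + 1
                queue.append((u, d + 1))
    codes = torch.tensor([dist[v] for v in sorted(adj_list)])
    return F.one_hot(codes, num_classes=max_dist + 1).float()

adj_list = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}  # path graph 0-1-2-3
de = distance_encoding(adj_list, targets={0})      # distances to node 0
x = torch.randn(4, 8)                              # raw node features
x_aug = torch.cat([x, de], dim=-1)                 # DE appended as extra features
print(x_aug.shape)                                 # torch.Size([4, 12])
```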
arXiv Detail & Related papers (2020-08-31T23:15:40Z)