Hybrid Micro/Macro Level Convolution for Heterogeneous Graph Learning
- URL: http://arxiv.org/abs/2012.14722v1
- Date: Tue, 29 Dec 2020 12:12:37 GMT
- Title: Hybrid Micro/Macro Level Convolution for Heterogeneous Graph Learning
- Authors: Le Yu, Leilei Sun, Bowen Du, Chuanren Liu, Weifeng Lv, Hui Xiong
- Abstract summary: Heterogeneous graphs are pervasive in practical scenarios, where each graph consists of multiple types of nodes and edges.
Most of the existing graph convolution approaches were designed for homogeneous graphs, and therefore cannot handle heterogeneous graphs.
We propose HGConv, a novel Heterogeneous Graph Convolution approach, to learn comprehensive node representations on heterogeneous graphs.
- Score: 45.14314180743549
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous graphs are pervasive in practical scenarios, where each graph
consists of multiple types of nodes and edges. Representation learning on
heterogeneous graphs aims to obtain low-dimensional node representations that
could preserve both node attributes and relation information. However, most of
the existing graph convolution approaches were designed for homogeneous graphs,
and therefore cannot handle heterogeneous graphs. Some recent methods designed
for heterogeneous graphs also face several issues, including insufficient
utilization of heterogeneous properties, structural information loss, and a
lack of interpretability. In this paper, we propose HGConv, a novel
Heterogeneous Graph Convolution approach, to learn comprehensive node
representations on heterogeneous graphs with a hybrid micro/macro level
convolutional operation. Unlike existing methods, HGConv can perform
convolutions directly on the intrinsic structure of heterogeneous graphs at
both micro and macro levels: a micro-level convolution learns the importance
of nodes within the same relation, and a macro-level convolution distinguishes
the subtle differences across relations. The hybrid strategy enables
HGConv to fully leverage heterogeneous information with proper
interpretability. Moreover, a weighted residual connection is designed to
aggregate both inherent attributes and neighbor information of the focal node
adaptively. Extensive experiments on various tasks demonstrate not only the
superiority of HGConv over existing methods, but also the intuitive
interpretability of our approach for graph analysis.
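The hybrid micro/macro convolution described above can be illustrated with a short sketch. The snippet below is a minimal, illustrative PyTorch rendering under simplifying assumptions (a single focal node, dense per-relation neighbor matrices, shared projections instead of the paper's type-specific parameters); the class and argument names such as `HybridConvSketch` and `relation_neighbors` are ours, not the authors' implementation.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HybridConvSketch(nn.Module):
    """Toy rendering of a hybrid micro/macro level convolution for one focal node."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.node_proj = nn.Linear(in_dim, out_dim)      # projects the focal node's attributes
        self.neigh_proj = nn.Linear(in_dim, out_dim)     # projects neighbor attributes
        self.micro_attn = nn.Linear(2 * out_dim, 1)      # scores neighbors within one relation
        self.macro_attn = nn.Linear(2 * out_dim, 1)      # scores relation-level summaries
        self.res_gate = nn.Parameter(torch.tensor(0.5))  # weighted residual connection

    def forward(self, focal: torch.Tensor, relation_neighbors: dict) -> torch.Tensor:
        # focal: (in_dim,) attributes of the focal node.
        # relation_neighbors: {relation name: (n_r, in_dim) neighbor attribute matrix}.
        h_focal = self.node_proj(focal)                                  # (out_dim,)
        relation_reprs = []
        for neigh in relation_neighbors.values():
            h_neigh = self.neigh_proj(neigh)                             # (n_r, out_dim)
            # Micro level: attention over the neighbors sharing this relation.
            pair = torch.cat([h_focal.expand_as(h_neigh), h_neigh], dim=-1)
            alpha = F.softmax(self.micro_attn(pair).squeeze(-1), dim=0)  # (n_r,)
            relation_reprs.append(alpha @ h_neigh)                       # (out_dim,)
        rel = torch.stack(relation_reprs)                                # (R, out_dim)
        # Macro level: attention over the per-relation summaries.
        pair = torch.cat([h_focal.expand_as(rel), rel], dim=-1)
        beta = F.softmax(self.macro_attn(pair).squeeze(-1), dim=0)       # (R,)
        neighborhood = beta @ rel                                        # (out_dim,)
        # Weighted residual: adaptively mix the node's own attributes with
        # the aggregated neighborhood information.
        gate = torch.sigmoid(self.res_gate)
        return gate * h_focal + (1.0 - gate) * neighborhood


# Example: a focal paper node with author- and field-typed neighbor sets.
layer = HybridConvSketch(in_dim=8, out_dim=16)
out = layer(torch.randn(8), {"paper-author": torch.randn(3, 8),
                             "paper-field": torch.randn(5, 8)})
print(out.shape)  # torch.Size([16])
```
The gated residual here is a simple scalar mixing weight; the point of the sketch is only the two-stage attention, first within each relation and then across relations.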
Related papers
- HiGPT: Heterogeneous Graph Language Model [27.390123898556805]
Heterogeneous graph learning aims to capture complex relationships and diverse semantics among entities in a heterogeneous graph.
Existing frameworks for heterogeneous graph learning have limitations in generalizing across diverse heterogeneous graph datasets.
We propose HiGPT, a general large graph model with a heterogeneous graph instruction-tuning paradigm.
arXiv Detail & Related papers (2024-02-25T08:07:22Z)
- M2HGCL: Multi-Scale Meta-Path Integrated Heterogeneous Graph Contrastive Learning [16.391439666603578]
We propose a new multi-scale meta-path integrated heterogeneous graph contrastive learning (M2HGCL) model.
Specifically, we expand the meta-paths and jointly aggregate the direct neighbor information, the initial meta-path neighbor information and the expanded meta-path neighbor information.
Through extensive experiments on three real-world datasets, we demonstrate that M2HGCL outperforms the current state-of-the-art baseline models.
arXiv Detail & Related papers (2023-09-03T06:39:56Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Demystifying Graph Convolution with a Simple Concatenation [6.542119695695405]
We quantify the information overlap between graph topology, node features, and labels.
We show that graph concatenation is a simple but more flexible alternative to graph convolution; a toy sketch contrasting the two appears after this list.
arXiv Detail & Related papers (2022-07-18T16:39:33Z)
- Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method termed Geometry Contrastive Learning (GCL).
GCL views a heterogeneous graph from the Euclidean and hyperbolic perspectives simultaneously, aiming to combine the ability to model rich semantics with the ability to model complex structures.
Extensive experiments on four benchmark datasets show that the proposed approach outperforms strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z)
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural network (HGNN) is a very popular technique for the modeling and analysis of heterogeneous graphs.
We develop for the first time a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this new approach, we adopt distinct, well-suited attribute and topology fusion mechanisms in the two views, which are conducive to mining the relevant information in attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Learning on heterogeneous graphs using high-order relations [37.64632406923687]
We propose an approach for learning on heterogeneous graphs without using meta-paths.
We decompose a heterogeneous graph into different homogeneous relation-type graphs, which are then combined to create higher-order relation-type representations.
arXiv Detail & Related papers (2021-03-29T12:02:47Z)
- Heterogeneous Graph Transformer [49.675064816860505]
We present the Heterogeneous Graph Transformer (HGT) architecture for modeling Web-scale heterogeneous graphs.
To handle dynamic heterogeneous graphs, we introduce the relative temporal encoding technique into HGT.
To handle Web-scale graph data, we design a heterogeneous mini-batch graph sampling algorithm, HGSampling, for efficient and scalable training.
arXiv Detail & Related papers (2020-03-03T04:49:21Z)
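As referenced in the "Demystifying Graph Convolution with a Simple Concatenation" entry above, the toy sketch below contrasts standard graph convolution (neighbors averaged into each node before the linear transform) with the concatenation alternative (a node's own features and its aggregated neighbor features kept as separate inputs). It uses random dense tensors and illustrative variable names; it is our reading of the one-line summary above, not code from that paper.
```python
import torch
import torch.nn as nn

n, d, h = 5, 4, 8
X = torch.randn(n, d)                       # node features
A = (torch.rand(n, n) < 0.4).float()        # random adjacency for the toy example
A_hat = A + torch.eye(n)                    # add self-loops
D_inv = torch.diag(1.0 / A_hat.sum(dim=1))  # row normalization
P = D_inv @ A_hat                           # normalized propagation matrix

# Graph convolution: neighbor features are mixed into the node before the transform.
conv = nn.Linear(d, h)
H_conv = torch.relu(conv(P @ X))

# Graph concatenation: own and aggregated neighbor features stay separate, so the
# linear layer can weight self information and topology information independently.
concat = nn.Linear(2 * d, h)
H_cat = torch.relu(concat(torch.cat([X, P @ X], dim=1)))
print(H_conv.shape, H_cat.shape)            # both torch.Size([5, 8])
```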