HIRE: Distilling High-order Relational Knowledge From Heterogeneous
Graph Neural Networks
- URL: http://arxiv.org/abs/2207.11887v1
- Date: Mon, 25 Jul 2022 03:25:15 GMT
- Title: HIRE: Distilling High-order Relational Knowledge From Heterogeneous
Graph Neural Networks
- Authors: Jing Liu, Tongya Zheng, and Qinfen Hao
- Abstract summary: We propose a versatile plug-and-play module, which accounts for distilling relational knowledge from pre-trained HGNNs.
To the best of our knowledge, we are the first to propose a HIgh-order RElational (HIRE) knowledge distillation framework on heterogeneous graphs.
- Score: 4.713436329217004
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Researchers have recently proposed plenty of heterogeneous graph neural
networks (HGNNs) due to the ubiquity of heterogeneous graphs in both academic
and industrial areas. Instead of pursuing a more powerful HGNN model, in this
paper, we are interested in devising a versatile plug-and-play module, which
accounts for distilling relational knowledge from pre-trained HGNNs.
To the best of our knowledge, we are the first to propose a HIgh-order
RElational (HIRE) knowledge distillation framework on heterogeneous graphs,
which can significantly boost the prediction performance regardless of model
architectures of HGNNs. Concretely, our HIRE framework initially performs
first-order node-level knowledge distillation, which encodes the semantics of
the teacher HGNN with its prediction logits. Meanwhile, the second-order
relation-level knowledge distillation imitates the relational correlation
between node embeddings of different types generated by the teacher HGNN.
Extensive experiments on various popular HGNN models and three real-world
heterogeneous graphs demonstrate that our method obtains consistent and
considerable performance enhancement, proving its effectiveness and
generalization ability.
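As a rough illustration of the two distillation terms described above, here is a minimal PyTorch sketch. The function names, the temperature `tau`, and the use of cosine similarity over type-mean embeddings as the "relational correlation" are assumptions for illustration, not the paper's exact formulation.
```python
import torch
import torch.nn.functional as F

def node_level_kd(student_logits, teacher_logits, tau=2.0):
    # First-order KD: match the teacher HGNN's softened prediction logits.
    t = F.softmax(teacher_logits / tau, dim=-1)
    s = F.log_softmax(student_logits / tau, dim=-1)
    return F.kl_div(s, t, reduction="batchmean") * tau * tau

def relation_level_kd(student_emb, teacher_emb):
    # Second-order KD: imitate the relational correlation between node embeddings
    # of different types. Each argument maps node type -> [num_nodes, dim] tensor,
    # with the same key order on the teacher and student sides.
    def type_correlation(emb_by_type):
        # Summarize each type by its mean embedding and take pairwise cosine
        # similarities as a type-by-type correlation matrix (an assumption).
        means = torch.stack([e.mean(dim=0) for e in emb_by_type.values()])
        means = F.normalize(means, dim=-1)
        return means @ means.t()
    return F.mse_loss(type_correlation(student_emb), type_correlation(teacher_emb))
```
In a full pipeline, the student would typically be trained on a weighted combination of these two terms and the ordinary supervised loss.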
Related papers
- Teaching MLPs to Master Heterogeneous Graph-Structured Knowledge for Efficient and Accurate Inference [53.38082028252104]
We introduce HG2M and HG2M+ to combine both HGNNs' superior performance and MLPs' efficient inference.
HG2M directly trains students with node features as input and soft labels from teacher HGNNs as targets.
HG2Ms demonstrate a 379.24× speedup in inference over HGNNs on the large-scale IGB-3M-19 dataset.
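A minimal sketch of the distillation step this summary describes, assuming a plain PyTorch training loop; the function name and signature are hypothetical.
```python
import torch.nn.functional as F

def hg2m_distill_step(mlp, optimizer, node_feats, teacher_soft_labels):
    # The student MLP sees only node features (no graph structure) and is
    # supervised by the teacher HGNN's soft labels, as described above.
    optimizer.zero_grad()
    log_probs = F.log_softmax(mlp(node_feats), dim=-1)
    loss = F.kl_div(log_probs, teacher_soft_labels, reduction="batchmean")
    loss.backward()
    optimizer.step()
    return loss.item()
```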
arXiv Detail & Related papers (2024-11-21T11:39:09Z) - BG-HGNN: Toward Scalable and Efficient Heterogeneous Graph Neural
Network [6.598758004828656]
Heterogeneous graph neural networks (HGNNs) stand out as a promising neural model class designed for heterogeneous graphs.
Existing HGNNs employ different parameter spaces to model the varied relationships.
This paper introduces Blend&Grind-HGNN, which integrates different relations into a unified feature space manageable by a single set of parameters.
arXiv Detail & Related papers (2024-03-13T03:03:40Z) - Breaking the Entanglement of Homophily and Heterophily in
Semi-supervised Node Classification [25.831508778029097]
We introduce AMUD, which quantifies the relationship between node profiles and topology from a statistical perspective.
We also propose ADPA as a new directed graph learning paradigm for AMUD.
arXiv Detail & Related papers (2023-12-07T07:54:11Z) - Label Deconvolution for Node Representation Learning on Large-scale
Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z) - GCNH: A Simple Method For Representation Learning On Heterophilous
Graphs [4.051099980410583]
Graph Neural Networks (GNNs) are well-suited for learning on homophilous graphs.
Recent works have proposed extensions to standard GNN architectures to improve performance on heterophilous graphs.
We propose GCN for Heterophily (GCNH), a simple yet effective GNN architecture applicable to both heterophilous and homophilous scenarios.
arXiv Detail & Related papers (2023-04-21T11:26:24Z) - Geometric Knowledge Distillation: Topology Compression for Graph Neural
Networks [80.8446673089281]
We study a new paradigm of knowledge transfer that aims at encoding graph topological information into graph neural networks (GNNs).
We propose the Neural Heat Kernel (NHK) to encapsulate the geometric properties of the underlying manifold with respect to the GNN architecture.
A fundamental and principled solution is derived by aligning NHKs on teacher and student models, dubbed Geometric Knowledge Distillation.
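The sketch below is only a toy stand-in for the alignment idea: it builds an ordinary Laplacian heat kernel from each model's node embeddings and matches the two, whereas the paper's Neural Heat Kernel is defined with respect to the GNN architecture itself.
```python
import torch
import torch.nn.functional as F

def heat_kernel(emb, t=1.0):
    # Dense affinity matrix from node embeddings, its Laplacian, and the
    # heat kernel exp(-t * L) as a rough geometric signature.
    z = F.normalize(emb, dim=-1)
    affinity = z @ z.t()
    laplacian = torch.diag(affinity.sum(dim=-1)) - affinity
    return torch.matrix_exp(-t * laplacian)

def geometric_kd_loss(student_emb, teacher_emb, t=1.0):
    # Align the two heat kernels, echoing the NHK-alignment idea above.
    return F.mse_loss(heat_kernel(student_emb, t), heat_kernel(teacher_emb, t))
```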
arXiv Detail & Related papers (2022-10-24T08:01:58Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
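A hypothetical PyTorch layer illustrating the "one parameter per relation" idea; the shared linear transform, sum aggregation, and absence of normalization are assumptions rather than the actual RE-GNN design.
```python
import torch
import torch.nn as nn

class RelationScaledConv(nn.Module):
    # Homogeneous-GNN-style transform whose messages are scaled by one
    # learnable scalar per edge type, plus a weighted self-loop connection.
    def __init__(self, in_dim, out_dim, num_relations):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.rel_weight = nn.Parameter(torch.ones(num_relations))  # one scalar per relation
        self.self_weight = nn.Parameter(torch.ones(1))             # scalar for self-loops

    def forward(self, x, edge_index, edge_type):
        # x: [N, in_dim]; edge_index: [2, E] (source, target); edge_type: [E]
        src, dst = edge_index
        h = self.lin(x)
        msg = h[src] * self.rel_weight[edge_type].unsqueeze(-1)
        out = torch.zeros_like(h)
        out.index_add_(0, dst, msg)        # aggregate messages per target node
        return out + self.self_weight * h  # weighted self-loop connection
```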
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Heterogeneous Graph Neural Networks using Self-supervised Reciprocally
Contrastive Learning [102.9138736545956]
Heterogeneous graph neural networks (HGNNs) are a widely used technique for modeling and analyzing heterogeneous graphs.
We develop for the first time a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views guided respectively by node attributes and graph topologies.
The approach adopts distinct, view-specific attribute and topology fusion mechanisms in the two views, which help mine the relevant information in attributes and topologies separately.
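As an illustration of the two-view idea, a standard symmetric InfoNCE objective between an attribute-guided view and a topology-guided view might look as follows; HGCL's actual objective and fusion mechanisms differ.
```python
import torch
import torch.nn.functional as F

def cross_view_contrastive(z_attr, z_topo, tau=0.5):
    # z_attr, z_topo: [N, dim] node embeddings from the attribute-guided and
    # topology-guided views; the same node across views forms a positive pair.
    z1 = F.normalize(z_attr, dim=-1)
    z2 = F.normalize(z_topo, dim=-1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0), device=z1.device)
    # Symmetric InfoNCE over both view directions.
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))
```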
arXiv Detail & Related papers (2022-04-30T12:57:02Z) - Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.