Self-supervised Learning for Heterogeneous Graph via Structure Information based on Metapath
- URL: http://arxiv.org/abs/2209.04218v1
- Date: Fri, 9 Sep 2022 10:06:18 GMT
- Title: Self-supervised Learning for Heterogeneous Graph via Structure Information based on Metapath
- Authors: Shuai Ma, Jian-wei Liu, Xin Zuo
- Abstract summary: Self-supervised representation learning, which generates labels from the data itself, is a potential approach to reducing the dependence of GNN training on labeled data.
In this paper, we propose a SElf-supervised learning method for heterogeneous graphs via Structure Information based on Metapath (SESIM).
To predict jump numbers, SESIM generates labels from the data itself, avoiding time-consuming manual labeling.
- Score: 9.757299837675204
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) are the dominant paradigm for modeling and
handling graph-structured data by learning universal node representations. The
traditional way of training GNNs depends on a large amount of labeled data, which
is costly and time-consuming to obtain, and in some scenarios such data is
unavailable altogether. Self-supervised representation learning, which generates
labels from the graph-structured data itself, is a potential approach to tackle
this problem. Self-supervised learning for heterogeneous graphs is more
challenging than for homogeneous graphs and has received comparatively little
study. In this paper, we propose a SElf-supervised learning method for
heterogeneous graphs via Structure Information based on Metapath (SESIM). The
proposed model constructs pretext tasks that predict the jump number between
nodes in each metapath, improving the representation ability of the primary
task. To predict jump numbers, SESIM generates labels from the data itself,
avoiding time-consuming manual labeling. Moreover, predicting the jump number in
each metapath makes effective use of graph structure information, an essential
property of the relationship between nodes, so SESIM deepens the model's
understanding of graph structure. Finally, we train the primary task and the
pretext tasks jointly, using meta-learning to balance the contribution of the
pretext tasks to the primary task. Empirical results validate SESIM and
demonstrate that it improves the representation ability of traditional neural
networks on link prediction and node classification tasks.
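The paper describes its pretext task only in prose, so the sketch below is a hypothetical Python rendering of the two ideas the abstract names: generating jump-number labels from the graph itself, and combining pretext losses with the primary loss. The graph representation, function names, and fixed loss weights are assumptions of this sketch; SESIM learns the per-metapath weights with meta-learning rather than fixing them by hand.

```python
# Hypothetical sketch of SESIM-style self-generated pretext labels (not the
# authors' code). A heterogeneous graph is given as adjacency lists of typed
# edges: graph[u] = [(edge_type, v), ...]. A metapath is a sequence of edge
# types, e.g. ("writes", "written_by") for Author-Paper-Author.
import random

def metapath_step(graph, frontier, metapath):
    """All nodes reachable from `frontier` by one full metapath traversal."""
    for edge_type in metapath:
        frontier = {v for u in frontier
                    for (etype, v) in graph[u] if etype == edge_type}
    return frontier

def jump_number_labels(graph, nodes, metapath, max_jumps=4, n_sources=1000):
    """Labels (u, v, k): node v is first reached from u after k metapath
    jumps. Generated from structure alone -- no manual annotation needed."""
    labels = []
    for u in random.sample(list(nodes), min(n_sources, len(nodes))):
        seen, frontier = {u}, {u}
        for k in range(1, max_jumps + 1):
            frontier = metapath_step(graph, frontier, metapath) - seen
            labels.extend((u, v, k) for v in frontier)
            seen |= frontier
    return labels

# Joint objective (simplified): the paper balances pretext contributions via
# meta-learning; fixed weights w[p] stand in for that mechanism here:
# loss = primary_loss + sum(w[p] * pretext_loss[p] for p in metapaths)
```

Each pretext task is then a small classification head that predicts k for a sampled pair (u, v), trained jointly with the primary task.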
Related papers
- Learning From Graph-Structured Data: Addressing Design Issues and Exploring Practical Applications in Graph Representation Learning [2.492884361833709]
We present an exhaustive review of the latest advancements in graph representation learning and Graph Neural Networks (GNNs).
GNNs, tailored to handle graph-structured data, excel in deriving insights and predictions from intricate relational information.
Our work delves into the capabilities of GNNs, examining their foundational designs and their application in addressing real-world challenges.
arXiv Detail & Related papers (2024-11-09T19:10:33Z)
- UniGraph: Learning a Unified Cross-Domain Foundation Model for Text-Attributed Graphs [30.635472655668078]
We propose a foundation model for Text-Attributed Graphs (TAGs) that can generalize to unseen graphs and tasks across diverse domains.
We propose a novel cascaded architecture of Language Models (LMs) and Graph Neural Networks (GNNs) as backbone networks.
We demonstrate the model's effectiveness in self-supervised representation learning on unseen graphs, few-shot in-context transfer, and zero-shot transfer.
arXiv Detail & Related papers (2024-02-21T09:06:31Z)
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly simple approach for textual graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) on a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of the fine-tuned LM (a minimal sketch of the two stages follows below).
arXiv Detail & Related papers (2023-08-03T07:00:04Z)
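The two-stage recipe this SimTeG entry describes maps naturally onto the Hugging Face transformers and peft libraries. The sketch below is an illustration under assumptions of ours (choice of DistilBERT, LoRA settings, mean pooling), not the paper's exact configuration:

```python
# Illustrative sketch of a SimTeG-style two-stage recipe (not the authors'
# code): (1) parameter-efficient fine-tuning of a pre-trained LM on the
# downstream labels, (2) node features taken from the last hidden states of
# the fine-tuned LM. Model choice and LoRA settings are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, get_peft_model

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
encoder = AutoModel.from_pretrained("distilbert-base-uncased")
encoder = get_peft_model(encoder, LoraConfig(
    r=8, lora_alpha=16, target_modules=["q_lin", "v_lin"]))

# ... stage 1: fine-tune `encoder` plus a task head on the node labels ...

@torch.no_grad()
def node_embeddings(texts, batch_size=32):
    """Stage 2: masked mean-pool of last hidden states -> fixed-size node
    features that a downstream GNN consumes in place of raw text."""
    chunks = []
    for i in range(0, len(texts), batch_size):
        batch = tokenizer(texts[i:i + batch_size], padding=True,
                          truncation=True, return_tensors="pt")
        hidden = encoder(**batch).last_hidden_state      # [B, T, H]
        mask = batch["attention_mask"].unsqueeze(-1)     # [B, T, 1]
        chunks.append((hidden * mask).sum(1) / mask.sum(1))
    return torch.cat(chunks)
```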
- GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889]
This paper introduces a mathematical definition of the novel problem setting of universal, generalizable graph structure learning.
We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
arXiv Detail & Related papers (2023-06-20T03:33:22Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity in modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the limitations of prior meta-path-based models.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths (a toy sketch of this pipeline follows below).
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
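As a rough, assumption-heavy illustration of the three stages this SHGNN entry lists, here is a dense-tensor toy in PyTorch; the paper's tree-attention aggregator is more involved than the plain propagation sketched here, and every name below is ours:

```python
# Toy rendering of the SHGNN pipeline summary above: feature propagation
# along a meta-path, then attention-based fusion across meta-paths. This
# simplifies stage 2 (the tree-attention aggregator) to plain propagation.
import torch
import torch.nn.functional as F

def propagate(h, hop_adjs):
    """Stages 1-2 (simplified): push intermediate-node features along a
    meta-path, given one row-normalized adjacency matrix per hop."""
    for adj in hop_adjs:        # e.g. author->paper, then paper->author
        h = adj @ h
    return h

def fuse_metapaths(per_path, q):
    """Stage 3: attention over per-meta-path node representations.
    per_path: [P, N, H] stacked outputs; q: [H] learnable query vector."""
    scores = torch.einsum("pnh,h->pn", torch.tanh(per_path), q)
    alpha = F.softmax(scores.mean(dim=1), dim=0)     # one weight per path
    return torch.einsum("p,pnh->nh", alpha, per_path)

# Usage: fuse_metapaths(torch.stack([propagate(x, adjs) for adjs in paths]), q)
```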
- Meta-Inductive Node Classification across Graphs [6.0471030308057285]
We propose a novel meta-inductive framework called MI-GNN to customize the inductive model to each graph.
MI-GNN does not directly learn an inductive model; it learns the general knowledge of how to train a model for semi-supervised node classification on new graphs.
Extensive experiments on five real-world graph collections demonstrate the effectiveness of our proposed model.
arXiv Detail & Related papers (2021-05-14T09:16:28Z)
- Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance over state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z)
- Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous Graphs [21.617020380894488]
We propose a novel self-supervised auxiliary learning method to learn graph neural networks on heterogeneous graphs.
Our method can be applied to any graph neural networks in a plug-in manner without manual labeling or additional data.
arXiv Detail & Related papers (2020-07-16T12:32:11Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets (a minimal sketch of the contrastive objective follows below).
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
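For a concrete sense of what "contrastive coding" means here, below is a generic InfoNCE objective in the spirit of GCC's subgraph instance discrimination. It is a minimal sketch rather than GCC's released implementation, which adds machinery (e.g., momentum-updated encoders) beyond what is shown:

```python
# Minimal InfoNCE loss in the spirit of GCC's contrastive pre-training:
# two augmented views of the same subgraph form a positive pair; all other
# subgraphs in the batch serve as negatives. A sketch, not GCC's code.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.07):
    """z1, z2: [B, H] encoder outputs for two views of the same B subgraphs."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature     # [B, B] pairwise similarities
    targets = torch.arange(z1.size(0))     # positives sit on the diagonal
    return F.cross_entropy(logits, targets)
```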