Cross-view Self-Supervised Learning on Heterogeneous Graph Neural
Network via Bootstrapping
- URL: http://arxiv.org/abs/2201.03340v2
- Date: Tue, 11 Jan 2022 14:35:36 GMT
- Title: Cross-view Self-Supervised Learning on Heterogeneous Graph Neural
Network via Bootstrapping
- Authors: Minjae Park
- Abstract summary: Heterogeneous graph neural networks can represent the
information in heterogeneous graphs with excellent ability.
In this paper, we introduce a method that can generate good representations
without generating a large number of pairs.
The proposed model achieves state-of-the-art performance compared with
other methods on various real-world datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Heterogeneous graph neural networks can represent the
information in heterogeneous graphs with excellent ability. Recently,
self-supervised learning approaches have been studied that learn a unique
representation of a graph through contrastive learning. In the absence of
labels, these methods show great potential. However, contrastive learning
relies heavily on positive and negative pairs, and generating high-quality
pairs from heterogeneous graphs is difficult. In this paper, in line with
recent innovations in self-supervised learning such as BYOL, or
bootstrapping, we introduce a method that can generate good representations
without generating a large number of pairs. In addition, exploiting the
fact that heterogeneous graphs can be viewed from two perspectives, the
network schema and meta-path views, high-level representations in the
graphs are captured and expressed. The proposed model achieves
state-of-the-art performance compared with other methods on various
real-world datasets.
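The abstract does not give the objective in detail, but a minimal sketch of
a BYOL-style cross-view bootstrapping loss is shown below, assuming an
online encoder with a predictor head and an EMA-updated target encoder.
The placeholder linear layers stand in for the network-schema and
meta-path GNN views; all names here are illustrative, not the paper's.
```python
import torch
import torch.nn.functional as F

class CrossViewBootstrap(torch.nn.Module):
    """BYOL-style objective across two graph views (illustrative sketch).

    The online network encodes one view and predicts the target network's
    encoding of the other view; the target is an exponential moving
    average (EMA) of the online weights, so no negative pairs are needed.
    """

    def __init__(self, dim: int = 64, momentum: float = 0.99):
        super().__init__()
        # Placeholder encoders; a real model would run GNNs over the
        # network-schema and meta-path views of the heterogeneous graph.
        self.online = torch.nn.Linear(dim, dim)
        self.predictor = torch.nn.Linear(dim, dim)
        self.target = torch.nn.Linear(dim, dim)
        self.target.load_state_dict(self.online.state_dict())
        for p in self.target.parameters():
            p.requires_grad = False
        self.momentum = momentum

    @torch.no_grad()
    def update_target(self) -> None:
        # EMA update: target <- m * target + (1 - m) * online.
        for pt, po in zip(self.target.parameters(), self.online.parameters()):
            pt.mul_(self.momentum).add_((1 - self.momentum) * po)

    def loss(self, x_schema: torch.Tensor, x_metapath: torch.Tensor):
        # Each view predicts the target embedding of the *other* view.
        p1 = self.predictor(self.online(x_schema))
        p2 = self.predictor(self.online(x_metapath))
        with torch.no_grad():
            t1 = self.target(x_schema)
            t2 = self.target(x_metapath)
        # Symmetrized negative cosine similarity (the BYOL loss).
        l1 = 2 - 2 * F.cosine_similarity(p1, t2, dim=-1).mean()
        l2 = 2 - 2 * F.cosine_similarity(p2, t1, dim=-1).mean()
        return (l1 + l2) / 2
```
Calling `update_target()` after each optimizer step keeps the target
network trailing the online one, which is what prevents representation
collapse without negative pairs.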
Related papers
- Disentangled Generative Graph Representation Learning [51.59824683232925]
This paper introduces DiGGR (Disentangled Generative Graph Representation Learning), a self-supervised learning framework.
It aims to learn latent disentangled factors and utilize them to guide graph mask modeling.
Experiments on 11 public datasets for two different graph learning tasks demonstrate that DiGGR consistently outperforms many previous self-supervised methods.
arXiv Detail & Related papers (2024-08-24T05:13:02Z)
- Imbalanced Graph Classification with Multi-scale Oversampling Graph Neural Networks [25.12261412297796]
We introduce a novel multi-scale oversampling graph neural network (MOSGNN) that learns expressive minority graph representations.
It achieves this by jointly optimizing subgraph-level, graph-level, and pairwise-graph learning tasks.
Experiments on 16 imbalanced graph datasets show that MOSGNN significantly outperforms five state-of-the-art models.
arXiv Detail & Related papers (2024-05-08T09:16:54Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
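The bank of transformations itself isn't described in this summary; as a
generic illustration only (edge dropping and feature masking are common
candidates, not necessarily the paper's spectral operations), such a bank
might look like:
```python
import numpy as np

rng = np.random.default_rng(0)

def drop_edges(adj, feats, p=0.2):
    """Randomly remove a fraction p of the edges of an undirected graph."""
    keep = np.triu(rng.random(adj.shape) >= p, 1)  # one draw per edge
    keep = keep | keep.T
    return adj * keep, feats

def mask_features(adj, feats, p=0.2):
    """Zero out a random subset of feature dimensions for all nodes."""
    out = feats.copy()
    out[:, rng.random(feats.shape[1]) < p] = 0.0
    return adj, out

# Bank of candidate augmentations sampled when building contrastive views.
AUGMENTATIONS = [drop_edges, mask_features]

def random_view(adj, feats):
    aug = AUGMENTATIONS[rng.integers(len(AUGMENTATIONS))]
    return aug(adj, feats)
```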
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- RHCO: A Relation-aware Heterogeneous Graph Neural Network with Contrastive Learning for Large-scale Graphs [26.191673964156585]
We propose a novel Relation-aware Heterogeneous Graph Neural Network with Contrastive Learning (RHCO) for large-scale heterogeneous graph representation learning.
RHCO achieves the best performance among state-of-the-art models.
arXiv Detail & Related papers (2022-11-20T04:45:04Z)
- CGMN: A Contrastive Graph Matching Network for Self-Supervised Graph Similarity Learning [65.1042892570989]
We propose a contrastive graph matching network (CGMN) for self-supervised graph similarity learning.
We employ two strategies, namely cross-view interaction and cross-graph interaction, for effective node representation learning.
We transform node representations into graph-level representations via pooling operations for graph similarity computation.
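The pooling step in the last sentence is a standard readout; a minimal
sketch (mean/max concatenation is an assumption, since the summary does
not name the exact operator):
```python
import numpy as np

def graph_readout(node_reps: np.ndarray) -> np.ndarray:
    """Pool an (N, d) node-representation matrix into one graph vector
    by concatenating mean and max readouts over the node axis."""
    return np.concatenate([node_reps.mean(axis=0), node_reps.max(axis=0)])

def graph_similarity(g1: np.ndarray, g2: np.ndarray) -> float:
    """Cosine similarity between two pooled graph-level representations."""
    return float(g1 @ g2 / (np.linalg.norm(g1) * np.linalg.norm(g2)))
```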
arXiv Detail & Related papers (2022-05-30T13:20:26Z)
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
The heterogeneous graph neural network (HGNN) is a very popular technique for the modeling and analysis of heterogeneous graphs.
We develop for the first time a novel and robust heterogeneous graph contrastive learning approach, namely HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this new approach, we adopt distinct but most suitable attribute and topology fusion mechanisms in the two views, which are conducive to mining relevant information in attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- Graph Self-supervised Learning with Accurate Discrepancy Learning [64.69095775258164]
We propose a framework that aims to learn the exact discrepancy between the original and the perturbed graphs, coined Discrepancy-based Self-supervised LeArning (D-SLA).
We validate our method on various graph-related downstream tasks, including molecular property prediction, protein function prediction, and link prediction tasks, on which our model largely outperforms relevant baselines.
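As a toy illustration of the discrepancy idea above only (not D-SLA's
actual loss): the embedding distance between the original graph and each
perturbed copy is trained to track how many edits produced it, rather
than a binary same/different label:
```python
import torch

def discrepancy_loss(z_orig, z_perts, num_edits):
    """Toy discrepancy objective (illustrative, not the paper's loss).

    z_orig:    (d,) embedding of the original graph
    z_perts:   (k, d) embeddings of k perturbed copies
    num_edits: (k,) number of edits used to create each copy
    """
    dist = torch.norm(z_perts - z_orig, dim=-1)          # (k,) distances
    edits = num_edits.float()
    # Normalize both so only the *relative* scale has to match.
    return torch.mean((dist / dist.mean() - edits / edits.mean()) ** 2)
```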
arXiv Detail & Related papers (2022-02-07T08:04:59Z)
- Multi-Level Graph Contrastive Learning [38.022118893733804]
We propose a Multi-Level Graph Contrastive Learning (MLGCL) framework for learning robust representation of graph data by contrasting space views of graphs.
The original graph is a first-order approximation structure and contains uncertainty or error, while the $k$NN graph generated by encoding features preserves high-order proximity.
Extensive experiments indicate MLGCL achieves promising results compared with the existing state-of-the-art graph representation learning methods on seven datasets.
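The kNN-graph view mentioned above can be made concrete; a small sketch
that builds a k-nearest-neighbour adjacency matrix from encoded node
features (cosine similarity is an assumption here):
```python
import numpy as np

def knn_graph(features: np.ndarray, k: int = 5) -> np.ndarray:
    """Connect each node to its k most cosine-similar neighbours."""
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)            # forbid self-loops
    nbrs = np.argsort(-sim, axis=1)[:, :k]    # top-k indices per node
    adj = np.zeros(sim.shape)
    rows = np.arange(sim.shape[0])[:, None]
    adj[rows, nbrs] = 1.0
    return np.maximum(adj, adj.T)             # symmetrize

# Contrast the original adjacency with knn_graph(encoded_features) as the
# second, high-order-proximity view.
```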
arXiv Detail & Related papers (2021-07-06T14:24:43Z)
- Graph Representation Learning by Ensemble Aggregating Subgraphs via Mutual Information Maximization [5.419711903307341]
We introduce a self-supervised learning method to enhance the graph-level representations learned by Graph Neural Networks.
To get a comprehensive understanding of the graph structure, we propose an ensemble-learning-like subgraph method.
To achieve efficient and effective contrastive learning, a Head-Tail contrastive sample construction method is proposed.
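The Head-Tail construction itself isn't detailed in this summary, but the
generic contrastive objective such methods optimize is typically InfoNCE;
a minimal sketch:
```python
import torch
import torch.nn.functional as F

def info_nce(anchors, positives, temperature=0.2):
    """Generic InfoNCE loss: each anchor must match its own positive,
    with the other positives in the batch serving as negatives."""
    a = F.normalize(anchors, dim=-1)
    p = F.normalize(positives, dim=-1)
    logits = a @ p.T / temperature       # (B, B) similarity matrix
    labels = torch.arange(a.size(0))     # positives sit on the diagonal
    return F.cross_entropy(logits, labels)
```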
arXiv Detail & Related papers (2021-03-24T12:06:12Z)