Self-supervised Learning and Graph Classification under Heterophily
- URL: http://arxiv.org/abs/2306.08469v1
- Date: Wed, 14 Jun 2023 12:32:38 GMT
- Title: Self-supervised Learning and Graph Classification under Heterophily
- Authors: Yilin Ding, Zhen Liu, Hao Hao
- Abstract summary: We propose a novel self-supervised strategy for Pre-training Graph neural networks (GNNs) based on the Metric (PGM)
Our strategy achieves state-of-the-art performance for molecular property prediction and protein function prediction.
- Score: 4.358149865548289
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Self-supervised learning has shown promising capability in graph
representation learning in recent work. Most existing pre-training strategies
choose the popular Graph neural networks (GNNs), which can be seen as a
special form of low-pass filter and therefore fail to effectively capture
heterophily. In this paper, we first present an experimental investigation of
the performance of low-pass and high-pass filters in heterophily graph
classification; the results clearly show that the high-frequency signal is
important for learning heterophily graph representations. On the other hand, it
is still unclear how to effectively capture the structural pattern of graphs
and how to measure the capability of a self-supervised pre-training strategy
to capture graph structure. To address this problem, we first design a
quantitative metric to Measure Graph Structure (MGS), which analyzes the
correlation between the structural similarity and the embedding similarity of
graph pairs. Then, to enhance the graph structural information captured by
self-supervised learning, we propose a novel self-supervised strategy for
Pre-training GNNs based on the Metric (PGM). Extensive experiments validate
that our pre-training strategy achieves state-of-the-art performance for
molecular property prediction and protein function prediction. In addition, we
find that choosing a suitable filter can sometimes be better than designing
good pre-training strategies for heterophily graph classification.
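The MGS metric described in the abstract, correlating structural similarity with embedding similarity across graph pairs, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the cosine similarity, the toy structural statistics (node count, edge count, average degree), and the function names `mgs_score` and `pairwise_cosine` are all assumptions for the sake of the example.

```python
import numpy as np

def pairwise_cosine(X):
    """Cosine similarity between all rows of X."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    return Xn @ Xn.T

def mgs_score(struct_feats, embeddings):
    """Hypothetical MGS-style score: Pearson correlation between the
    structural similarity and the embedding similarity of all graph pairs."""
    s = pairwise_cosine(struct_feats)
    e = pairwise_cosine(embeddings)
    iu = np.triu_indices(len(s), k=1)  # each unordered graph pair once
    return np.corrcoef(s[iu], e[iu])[0, 1]

# Toy example: 4 graphs summarized by assumed structural statistics
# (node count, edge count, average degree).
struct = np.array([[10, 15, 3.0],
                   [12, 18, 3.0],
                   [50, 200, 8.0],
                   [48, 190, 7.9]], dtype=float)

# A "good" encoder preserves structure, so its embedding similarities
# should track structural similarities and yield a score near 1.
good_emb = struct + 0.01 * np.random.default_rng(0).normal(size=struct.shape)
print(round(mgs_score(struct, good_emb), 3))
```

Under this sketch, an encoder whose embeddings mirror the structural statistics scores close to 1, while structure-oblivious embeddings drive the score toward 0, which matches the metric's intent of quantifying how well a pre-training strategy captures graph structure.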
Related papers
- GraphGLOW: Universal and Generalizable Structure Learning for Graph Neural Networks [72.01829954658889] (2023-06-20T03:33:22Z)
  This paper introduces the mathematical definition of this novel problem setting.
  We devise a general framework that coordinates a single graph-shared structure learner and multiple graph-specific GNNs.
  The well-trained structure learner can directly produce adaptive structures for unseen target graphs without any fine-tuning.
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836] (2023-02-06T16:26:29Z)
  Contrastive learning has emerged as a premier method for learning representations with or without supervision.
  Recent studies have shown its utility in graph representation learning for pre-training.
  We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
- Semantic Graph Neural Network with Multi-measure Learning for Semi-supervised Classification [5.000404730573809] (2022-12-04T06:17:11Z)
  Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
  Recent studies have shown that GNNs are vulnerable to the complex underlying structure of the graph.
  We propose a novel framework for semi-supervised classification.
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165] (2022-06-23T20:12:51Z)
  We propose to select positive graph instances directly from existing graphs in the training set.
  Our selection is based on certain domain-specific pair-wise similarity measurements.
  Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
- Optimal Propagation for Graph Neural Networks [51.08426265813481] (2022-05-06T03:37:00Z)
  We propose a bi-level optimization approach for learning the optimal graph structure.
  We also explore a low-rank approximation model for further reducing the time complexity.
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325] (2022-01-17T11:57:29Z)
  We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
  Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
- Hierarchical Adaptive Pooling by Capturing High-order Dependency for Graph Representation Learning [18.423192209359158] (2021-04-13T06:22:24Z)
  Graph neural networks (GNN) have been proven to be mature enough for handling graph-structured data on node-level graph representation learning tasks.
  This paper proposes a hierarchical graph-level representation learning framework, which is adaptively sensitive to graph structures.
- Structure-Enhanced Meta-Learning For Few-Shot Graph Classification [53.54066611743269] (2021-03-05T09:03:03Z)
  This work explores the potential of metric-based meta-learning for solving few-shot graph classification.
  An implementation upon GIN, named SMFGIN, is tested on two datasets, Chembl and TRIANGLES.