Decoupled Self-supervised Learning for Non-Homophilous Graphs
- URL: http://arxiv.org/abs/2206.03601v3
- Date: Mon, 2 Oct 2023 01:33:04 GMT
- Title: Decoupled Self-supervised Learning for Non-Homophilous Graphs
- Authors: Teng Xiao, Zhengyu Chen, Zhimeng Guo, Zeyang Zhuang, Suhang Wang
- Abstract summary: We develop a decoupled self-supervised learning framework for graph neural networks.
DSSL imitates a generative process of nodes and links from latent variable modeling of the semantic structure.
Our framework is agnostic to the encoders and does not need prefabricated augmentations.
- Score: 36.87585427004317
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper studies the problem of conducting self-supervised learning for
node representation learning on graphs. Most existing self-supervised learning
methods assume the graph is homophilous, where linked nodes often belong to the
same class or have similar features. However, such assumptions of homophily do
not always hold in real-world graphs. We address this problem by developing a
decoupled self-supervised learning (DSSL) framework for graph neural networks.
DSSL imitates a generative process of nodes and links from latent variable
modeling of the semantic structure, which decouples different underlying
semantics between different neighborhoods into the self-supervised learning
process. Our DSSL framework is agnostic to the encoders and does not need
prefabricated augmentations, and is thus flexible across different graphs. To
effectively optimize the framework, we derive the evidence lower bound of the
self-supervised objective and develop a scalable training algorithm with
variational inference. We provide a theoretical analysis to justify that DSSL
enjoys better downstream performance. Extensive experiments on various
types of graph benchmarks demonstrate that our proposed framework can achieve
better performance compared with competitive baselines.
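As a rough illustration of the variational recipe above, the sketch below writes down an evidence lower bound for a toy latent-variable model of a neighborhood: each neighbor is assigned a discrete latent semantic, reconstructed from a per-semantic prototype, with a KL term against a uniform prior. The prototype likelihood, the uniform prior, and all shapes are assumptions made for this sketch, not the paper's exact model.

```python
# A minimal sketch (not the paper's exact model): an evidence lower bound
# for a toy latent-variable model of a node's neighborhood. Each neighbor
# gets a discrete latent "semantic" z in {1..K}; q(z) is a mean-field
# variational posterior and the prior over z is uniform.
import torch

K, D = 4, 16                                  # latent semantics, embedding dim
mu = torch.randn(K, D, requires_grad=True)    # per-semantic prototypes

def elbo(h_nbrs, q_logits):
    """h_nbrs: (N, D) neighbor embeddings from any GNN encoder,
    q_logits: (N, K) logits of the variational posterior q(z | node, nbr)."""
    q = q_logits.softmax(dim=-1)
    # E_q[log p(h | z)]: Gaussian-style reconstruction from prototype mu_z
    log_lik = -((h_nbrs.unsqueeze(1) - mu.unsqueeze(0)) ** 2).sum(-1)  # (N, K)
    expected_ll = (q * log_lik).sum(-1).mean()
    # KL(q(z) || Uniform(K)) = sum_z q(z) (log q(z) + log K)
    kl = (q * (q.clamp_min(1e-9).log() + torch.log(torch.tensor(float(K))))).sum(-1).mean()
    return expected_ll - kl

h = torch.randn(8, D)                         # 8 neighbors of one node
q_logits = torch.randn(8, K, requires_grad=True)
loss = -elbo(h, q_logits)                     # maximize ELBO = minimize -ELBO
loss.backward()                               # gradients flow to mu and q_logits
```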
Related papers
- Self-Pro: A Self-Prompt and Tuning Framework for Graph Neural Networks [10.794305560114903]
Self-Prompt is a prompting framework for graphs based on the model and the data themselves.
We introduce asymmetric graph contrastive learning as the pretext task to address heterophily and to align the objectives of the pretext and downstream tasks.
We conduct extensive experiments on 11 benchmark datasets to demonstrate its superiority.
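One common way to make a contrastive pretext asymmetric is to give only one branch a predictor head while stopping gradients through the other (BYOL/SimSiam style); the summary does not specify Self-Pro's exact objective, so the sketch below is only a generic illustration of that pattern.

```python
# Illustrative asymmetric contrastive loss: only the online branch has a
# predictor head, and the target branch is detached. This is a generic
# pattern, not necessarily Self-Pro's exact objective.
import torch
import torch.nn.functional as F

D = 32
predictor = torch.nn.Linear(D, D)             # head on the online branch only

def asymmetric_loss(z_online, z_target):
    p = F.normalize(predictor(z_online), dim=1)
    t = F.normalize(z_target.detach(), dim=1) # stop-gradient: the asymmetry
    return -(p * t).sum(dim=1).mean()         # negative cosine similarity

z1 = torch.randn(10, D, requires_grad=True)   # embeddings of one graph view
z2 = torch.randn(10, D, requires_grad=True)   # embeddings of another view
loss = 0.5 * (asymmetric_loss(z1, z2) + asymmetric_loss(z2, z1))
loss.backward()
```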
arXiv Detail & Related papers (2023-10-16T12:58:04Z)
- Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D$2$PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure but also on a global graph that encodes global semantic similarities.
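A minimal sketch of the dual-channel idea, assuming a PPR-style diffusion and a simple average of the two channels (neither is necessarily D$2$PT's exact design): one channel propagates over the given incomplete adjacency, the other over a kNN graph built from global feature similarity.

```python
# Simplified dual-channel propagation: channel 1 diffuses features over
# the input graph, channel 2 over a global kNN similarity graph. The
# diffusion rule and the final average are assumptions for the sketch.
import numpy as np

def row_normalize(A):
    return A / A.sum(axis=1, keepdims=True).clip(min=1e-9)

def knn_graph(X, k):
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True).clip(min=1e-9)
    S = Xn @ Xn.T                              # global cosine similarities
    np.fill_diagonal(S, -np.inf)
    A = np.zeros_like(S)
    top = np.argsort(-S, axis=1)[:, :k]
    A[np.arange(X.shape[0])[:, None], top] = 1.0
    return np.maximum(A, A.T)                  # symmetrize

def dual_channel(X, A_in, k=5, steps=10, alpha=0.1):
    P_in = row_normalize(A_in)
    P_gl = row_normalize(knn_graph(X, k))
    H_in, H_gl = X.copy(), X.copy()
    for _ in range(steps):                     # PPR-style long-range diffusion
        H_in = (1 - alpha) * P_in @ H_in + alpha * X
        H_gl = (1 - alpha) * P_gl @ H_gl + alpha * X
    return 0.5 * (H_in + H_gl)                 # fuse the two channels
```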
arXiv Detail & Related papers (2023-05-29T04:51:09Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
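Reading "localized" as taking a node's first-order neighbors as positives, a minimal InfoNCE-style version looks like the sketch below; how Local-GCL actually constructs positives and scales its negatives may differ.

```python
# Minimal localized contrastive loss: first-order neighbors act as
# positives, all other nodes as negatives (our reading of "localized";
# Local-GCL's exact positive/negative construction may differ).
import torch
import torch.nn.functional as F

def local_contrastive_loss(H, edge_index, tau=0.5):
    """H: (N, D) node embeddings; edge_index: (2, E) directed neighbor pairs."""
    Z = F.normalize(H, dim=1)
    logits = Z @ Z.t() / tau                   # similarities to every node
    log_prob = logits.log_softmax(dim=1)
    src, dst = edge_index
    return -log_prob[src, dst].mean()          # pull linked nodes together

H = torch.randn(6, 8, requires_grad=True)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 0, 3, 2]])
local_contrastive_loss(H, edge_index).backward()
```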
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
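A minimal sketch of similarity-aware positive selection, with cosine similarity over precomputed graph-level descriptors standing in for the paper's domain-specific measurements:

```python
# Sketch of similarity-aware positive selection: for each anchor graph in
# the training set, take the most similar other graphs as positives. The
# cosine similarity here is a stand-in for the paper's domain-specific
# pair-wise similarity measurements.
import numpy as np

def select_positives(graph_feats, n_pos=2):
    """graph_feats: (G, D) one descriptor per training graph."""
    F_ = graph_feats / np.linalg.norm(graph_feats, axis=1, keepdims=True).clip(min=1e-9)
    S = F_ @ F_.T                               # pair-wise similarity matrix
    np.fill_diagonal(S, -np.inf)                # a graph is not its own positive
    return np.argsort(-S, axis=1)[:, :n_pos]    # top-n_pos positive indices
```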
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- Data-heterogeneity-aware Mixing for Decentralized Learning [63.83913592085953]
We characterize the dependence of convergence on the relationship between the mixing weights of the graph and the data heterogeneity across nodes.
We propose a metric that quantifies the ability of a graph to mix the current gradients.
Motivated by our analysis, we propose an approach that periodically and efficiently optimizes the metric.
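One plausible form of such a metric, sketched below, is the residual disagreement left after a single gossip step with mixing matrix W; the paper's exact definition may differ.

```python
# Hedged sketch of a gradient-mixing metric: how far one gossip step with
# mixing matrix W leaves per-node gradients from their global average.
# This is only the quantity the summary suggests, not the paper's formula.
import numpy as np

def gradient_mixing_residual(W, grads):
    """W: (n, n) doubly-stochastic mixing weights; grads: (n, d) per node."""
    consensus = grads.mean(axis=0, keepdims=True)
    mixed = W @ grads                           # one round of neighbor averaging
    return float(np.linalg.norm(mixed - consensus) ** 2)
```

Periodically re-optimizing W to shrink this residual corresponds to the approach described above.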
arXiv Detail & Related papers (2022-04-13T15:54:35Z)
- Graph Neural Network with Curriculum Learning for Imbalanced Node Classification [21.085314408929058]
Graph Neural Networks (GNNs) are an emerging technique for graph-based learning tasks such as node classification.
In this work, we reveal the vulnerability of GNNs to imbalanced node labels.
We propose a novel graph neural network framework with curriculum learning (GNN-CL) consisting of two modules.
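The summary gives no details of the two modules, so the sketch below only illustrates a generic curriculum for imbalanced node classification: easy (low-loss) nodes are admitted first, harder ones are phased in, and minority classes are up-weighted.

```python
# Generic curriculum-style reweighting for imbalanced node classification.
# Purely illustrative; GNN-CL's two modules are not reproduced here.
import torch

def curriculum_weights(losses, labels, epoch, total_epochs):
    """losses: (N,) per-node losses; labels: (N,) integer class ids."""
    progress = (epoch + 1) / total_epochs            # curriculum pace in (0, 1]
    cutoff = torch.quantile(losses, 0.5 + 0.5 * progress)
    admitted = (losses <= cutoff).float()            # harder nodes enter later
    counts = torch.bincount(labels).float().clamp_min(1.0)
    class_weight = (1.0 / counts)[labels]            # boost minority classes
    return admitted * class_weight
```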
arXiv Detail & Related papers (2022-02-05T10:46:11Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by the data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
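A toy version of this recipe, with a placeholder encoder (one row-normalized propagation step) and a feature-similarity stand-in for the anchor graph:

```python
# Toy anchor-graph contrast: a fixed "anchor" view built from the raw
# features and a learnable graph view are encoded by one shared
# propagation step, then aligned with a node-level contrastive loss.
# The graph parameterization and encoder are placeholder assumptions.
import torch
import torch.nn.functional as F

N, D = 50, 16
X = torch.randn(N, D)
adj_logits = torch.nn.Parameter(torch.randn(N, N))  # learnable structure

def propagate(adj_weights, X):
    return adj_weights.softmax(dim=1) @ X            # row-normalized smoothing

def node_contrast(Z1, Z2, tau=0.5):
    Z1, Z2 = F.normalize(Z1, dim=1), F.normalize(Z2, dim=1)
    logits = Z1 @ Z2.t() / tau
    return F.cross_entropy(logits, torch.arange(Z1.size(0)))  # i matches i

with torch.no_grad():
    anchor = X @ X.t()                               # stand-in anchor graph
loss = node_contrast(propagate(anchor, X), propagate(adj_logits, X))
loss.backward()                                      # refines adj_logits only
```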
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- A Deep Latent Space Model for Graph Representation Learning [10.914558012458425]
We propose a Deep Latent Space Model (DLSM) for directed graphs to incorporate traditional latent-variable generative models into deep learning frameworks.
Our proposed model consists of a graph convolutional network (GCN) encoder and a decoder, which are layer-wise connected by a hierarchical variational auto-encoder architecture.
Experiments on real-world datasets show that the proposed model achieves state-of-the-art performance on both link prediction and community detection tasks.
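A minimal model in this family (a VGAE-style sketch): a one-step graph-convolution encoder parameterizes the latent posterior and an inner-product decoder scores links. DLSM's hierarchical layer-wise coupling and its treatment of edge direction are omitted here.

```python
# Minimal latent space model in the VGAE family; not DLSM's hierarchical
# architecture, only the encoder/decoder split the summary describes.
import torch
import torch.nn.functional as F

class TinyLatentSpaceModel(torch.nn.Module):
    def __init__(self, d_in, d_z):
        super().__init__()
        self.mu_head = torch.nn.Linear(d_in, d_z)
        self.logvar_head = torch.nn.Linear(d_in, d_z)

    def encode(self, A_norm, X):
        H = A_norm @ X                              # one graph-convolution pass
        return self.mu_head(H), self.logvar_head(H)

    def loss(self, A_norm, X, A_target):
        mu, logvar = self.encode(A_norm, X)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        A_hat = torch.sigmoid(z @ z.t())            # inner-product link decoder
        recon = F.binary_cross_entropy(A_hat, A_target)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).mean()
        return recon + kl

A = torch.rand(20, 20).round()                      # toy binary adjacency
A_norm = A / A.sum(1, keepdim=True).clamp_min(1.0)
model = TinyLatentSpaceModel(d_in=8, d_z=4)
model.loss(A_norm, torch.randn(20, 8), A).backward()
```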
arXiv Detail & Related papers (2021-06-22T12:41:19Z)
- SAIL: Self-Augmented Graph Contrastive Learning [40.76236706250037]
This paper studies learning node representations with graph neural networks (GNNs) in unsupervised scenarios.
We derive a theoretical analysis and provide an empirical demonstration of the non-steady performance of GNNs over different graph datasets.
arXiv Detail & Related papers (2020-09-02T10:27:30Z)
- Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL) for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
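The iterative loop can be sketched as alternating between building a graph from the current embeddings and refining the embeddings on that graph; the cosine-kNN builder below stands in for IDGL's learned similarity metric, and the update rule is a simplification.

```python
# Rough shape of the iterative structure/embedding loop. The graph
# builder and embedding update are stand-ins, not IDGL's exact design.
import numpy as np

def graph_from_embeddings(H, k=5):
    Hn = H / np.linalg.norm(H, axis=1, keepdims=True).clip(min=1e-9)
    S = Hn @ Hn.T
    np.fill_diagonal(S, -np.inf)
    A = np.zeros_like(S)
    top = np.argsort(-S, axis=1)[:, :k]
    A[np.arange(H.shape[0])[:, None], top] = 1.0
    return np.maximum(A, A.T)

def iterative_graph_learning(X, rounds=5, k=5):
    H = X.copy()
    for _ in range(rounds):                         # rounds >= 1
        A = graph_from_embeddings(H, k)             # structure from embeddings
        P = A / A.sum(axis=1, keepdims=True).clip(min=1e-9)
        H = 0.5 * H + 0.5 * P @ H                   # embeddings from structure
    return A, H
```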
arXiv Detail & Related papers (2020-06-21T19:49:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.