Self-supervised Learning on Graphs: Contrastive, Generative, or Predictive
- URL: http://arxiv.org/abs/2105.07342v1
- Date: Sun, 16 May 2021 03:30:03 GMT
- Title: Self-supervised Learning on Graphs: Contrastive, Generative, or Predictive
- Authors: Lirong Wu, Haitao Lin, Zhangyang Gao, Cheng Tan, Stan Z. Li
- Abstract summary: Self-supervised learning (SSL) is emerging as a new paradigm for extracting informative knowledge through well-designed pretext tasks.
We divide existing graph SSL methods into three categories: contrastive, generative, and predictive.
We also summarize the commonly used datasets, evaluation metrics, downstream tasks, and open-source implementations of various algorithms.
- Score: 25.679620842010422
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning on graphs has recently achieved remarkable success on a variety
of tasks, but such success relies heavily on massive amounts of carefully labeled
data. However, precise annotations are generally very expensive and
time-consuming. To address this problem, self-supervised learning (SSL) is
emerging as a new paradigm for extracting informative knowledge through
well-designed pretext tasks without relying on manual labels. In this survey,
we extend the concept of SSL, which first emerged in the fields of computer
vision and natural language processing, to present a timely and comprehensive
review of the existing SSL techniques for graph data. Specifically, we divide
existing graph SSL methods into three categories: contrastive, generative, and
predictive. More importantly, unlike many other surveys that only provide a
high-level description of published research, we present an additional
mathematical summary of the existing works in a unified framework. Furthermore,
to facilitate methodological development and empirical comparisons, we also
summarize the commonly used datasets, evaluation metrics, downstream tasks, and
open-source implementations of various algorithms. Finally, we discuss the
technical challenges and potential future directions for improving graph
self-supervised learning.
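To make the taxonomy above concrete, the following minimal sketch illustrates the contrastive category: an InfoNCE-style objective that pulls together embeddings of the same node computed from two augmented views of a graph. The encoder, augmentation, and temperature here are illustrative assumptions for this sketch, not details prescribed by the survey.

```python
# Minimal sketch of contrastive graph SSL: maximize agreement between node
# embeddings produced from two stochastically augmented views of the same graph.
# The encoder and augmentation functions referenced in the usage note are
# hypothetical; tau is an illustrative temperature, not a prescribed value.
import torch
import torch.nn.functional as F

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """z1, z2: [num_nodes, dim] embeddings of the same nodes under two views."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                              # scaled pairwise similarities
    targets = torch.arange(z1.size(0), device=z1.device)    # positive pairs on the diagonal
    return F.cross_entropy(logits, targets)

# Typical usage with a hypothetical shared GNN encoder and augmentation
# (e.g. edge dropping or feature masking):
#   z1 = encoder(augment(graph)); z2 = encoder(augment(graph))
#   loss = info_nce(z1, z2); loss.backward()
```

In a typical pipeline of this kind, the encoder pre-trained with such a loss is then fine-tuned or linearly probed on a downstream task such as node or graph classification.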
Related papers
- Towards Graph Contrastive Learning: A Survey and Beyond [23.109430624817637]
Self-supervised learning (SSL) on graphs has gained increasing attention and has made significant progress.
SSL enables machine learning models to produce informative representations from unlabeled graph data.
However, Graph Contrastive Learning (GCL) itself has not been thoroughly investigated in the existing literature.
arXiv Detail & Related papers (2024-05-20T08:19:10Z) - Continual Learning on Graphs: Challenges, Solutions, and Opportunities [72.7886669278433]
We provide a comprehensive review of existing continual graph learning (CGL) algorithms.
We compare these methods with traditional continual learning techniques and analyze the applicability of those techniques to forgetting tasks.
We will maintain an up-to-date repository featuring a comprehensive list of accessible algorithms.
arXiv Detail & Related papers (2024-02-18T12:24:45Z) - Few-Shot Learning on Graphs: from Meta-learning to Pre-training and Prompting [56.25730255038747]
This survey endeavors to synthesize recent developments, provide comparative insights, and identify future directions.
We systematically categorize existing studies into three major families: meta-learning approaches, pre-training approaches, and hybrid approaches.
We analyze the relationships among these methods and compare their strengths and limitations.
arXiv Detail & Related papers (2024-02-02T14:32:42Z) - A Survey of Data-Efficient Graph Learning [16.053913182723143]
We introduce a novel concept of Data-Efficient Graph Learning (DEGL) as a research frontier.
We systematically review recent advances on several key aspects, including self-supervised graph learning, semi-supervised graph learning, and few-shot graph learning.
arXiv Detail & Related papers (2024-02-01T09:28:48Z) - Semi-Supervised and Unsupervised Deep Visual Learning: A Survey [76.2650734930974]
Semi-supervised learning and unsupervised learning offer promising paradigms to learn from an abundance of unlabeled visual data.
We review the recent advanced deep learning algorithms on semi-supervised learning (SSL) and unsupervised learning (UL) for visual recognition from a unified perspective.
arXiv Detail & Related papers (2022-08-24T04:26:21Z) - GraphMAE: Self-Supervised Masked Graph Autoencoders [52.06140191214428]
We present GraphMAE, a masked graph autoencoder that mitigates the issues of generative self-supervised graph learning.
We conduct extensive experiments on 21 public datasets for three different graph learning tasks.
The results show that GraphMAE, a simple graph autoencoder with careful designs, consistently outperforms both contrastive and generative state-of-the-art baselines (a simplified sketch of the masked-reconstruction idea appears after this list).
arXiv Detail & Related papers (2022-05-22T11:57:08Z) - Graph Self-Supervised Learning: A Survey [73.86209411547183]
Self-supervised learning (SSL) has become a promising and trending learning paradigm for graph data.
We present a timely and comprehensive review of the existing approaches which employ SSL techniques for graph data.
arXiv Detail & Related papers (2021-02-27T03:04:21Z) - Graph-based Semi-supervised Learning: A Comprehensive Review [51.26862262550445]
Semi-supervised learning (SSL) has tremendous value in practice due to its ability to utilize both labeled and unlabeled data.
An important class of SSL methods is to naturally represent data as graphs, which corresponds to graph-based semi-supervised learning (GSSL) methods.
GSSL methods have demonstrated their advantages in various domains due to their unique structure, the universality of their applications, and their scalability to large-scale data.
arXiv Detail & Related papers (2021-02-26T05:11:09Z)
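Complementing the GraphMAE entry above, the sketch below illustrates the generative category with a masked node-feature reconstruction objective: hide a random subset of node features, encode the corrupted graph, and reconstruct the hidden features. The zero-token corruption, mask ratio, and plain MSE loss are simplifying assumptions for illustration, not GraphMAE's exact design.

```python
# Minimal sketch of generative (masked-autoencoder style) graph SSL.
# `encoder` and `decoder` are hypothetical GNN modules; the zero-token masking
# and MSE reconstruction loss are simplifying assumptions, not GraphMAE's design.
import torch
import torch.nn.functional as F

def masked_feature_loss(x: torch.Tensor, encoder, decoder, mask_ratio: float = 0.5) -> torch.Tensor:
    """x: [num_nodes, feat_dim] node features of a single graph."""
    mask = torch.rand(x.size(0), device=x.device) < mask_ratio  # nodes whose features are hidden
    x_corrupt = x.clone()
    x_corrupt[mask] = 0.0                        # replace hidden features with a zero token
    h = encoder(x_corrupt)                       # embed the corrupted graph
    x_hat = decoder(h)                           # predict the original node features
    return F.mse_loss(x_hat[mask], x[mask])      # score reconstruction only on masked nodes
```

After pre-training with such an objective, the encoder's representations are typically evaluated on downstream tasks such as node or graph classification.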
This list is automatically generated from the titles and abstracts of the papers on this site.