Self-Supervised Learning of Graph Neural Networks: A Unified Review
- URL: http://arxiv.org/abs/2102.10757v2
- Date: Tue, 23 Feb 2021 18:12:23 GMT
- Title: Self-Supervised Learning of Graph Neural Networks: A Unified Review
- Authors: Yaochen Xie, Zhao Xu, Zhengyang Wang, Shuiwang Ji
- Abstract summary: Self-supervised learning is emerging as a new paradigm for making use of large amounts of unlabeled data.
We provide a unified review of different ways of training graph neural networks (GNNs) using SSL.
Our treatment of SSL methods for GNNs sheds light on the similarities and differences of various methods, setting the stage for developing new methods and algorithms.
- Score: 50.71341657322391
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep models trained in supervised mode have achieved remarkable success on a
variety of tasks. When labeled samples are limited, self-supervised learning
(SSL) has emerged as a new paradigm for making use of large amounts of
unlabeled data. SSL has achieved promising performance on natural language
and image learning tasks. Recently, there has been a trend toward extending
this success to graph data using graph neural networks (GNNs). In this survey,
we provide a unified review of different ways of training GNNs using SSL.
Specifically, we categorize SSL methods into contrastive and predictive models.
For each category, we provide a unified framework and describe how individual
methods differ in each of its components. Our unified treatment of SSL methods
for GNNs sheds light on the similarities and differences among various
methods, setting the stage for developing new methods and algorithms. We also
summarize the different SSL settings and the datasets used in each setting. To
facilitate methodological development and empirical comparison, we develop a
standardized testbed for SSL in GNNs, including implementations of common
baseline methods, datasets, and evaluation metrics.
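To make the contrastive category concrete, here is a minimal, hedged sketch of the common recipe the survey describes: encode two stochastically augmented views of a graph with a shared GNN and maximize agreement between corresponding node embeddings via an InfoNCE-style loss. This is not the authors' standardized testbed; names such as `SimpleGCN`, `drop_edges`, and `info_nce` are illustrative assumptions.
```python
# Illustrative sketch of contrastive SSL on graphs; not the paper's code.
import torch
import torch.nn.functional as F

class SimpleGCN(torch.nn.Module):
    """One-layer GCN on a dense adjacency: H = ReLU(A_hat X W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = torch.nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # Symmetric normalization with self-loops: A_hat = D^-1/2 (A+I) D^-1/2
        a = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a.sum(dim=1).pow(-0.5)
        a_hat = d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :]
        return F.relu(self.lin(a_hat @ x))

def drop_edges(adj, p=0.2):
    """Random edge dropping, one common stochastic graph augmentation."""
    keep = (torch.rand_like(adj) > p).float()
    keep = torch.triu(keep, diagonal=1)
    return adj * (keep + keep.t())  # keep the augmented view symmetric

def info_nce(z1, z2, tau=0.5):
    """Treat the same node in the two views as the positive pair."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = (z1 @ z2.t()) / tau          # cross-view similarity matrix
    targets = torch.arange(z1.size(0))    # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

# Toy usage: 6 nodes with 8 features and a random symmetric adjacency.
x = torch.randn(6, 8)
adj = (torch.rand(6, 6) > 0.5).float()
adj = torch.triu(adj, diagonal=1); adj = adj + adj.t()
encoder = SimpleGCN(8, 16)
loss = info_nce(encoder(x, drop_edges(adj)), encoder(x, drop_edges(adj)))
loss.backward()  # gradients flow into the shared encoder
```
In the survey's taxonomy, predictive methods differ mainly in the objective: instead of a contrastive loss, the encoder is trained against a pretext target such as reconstructed node attributes or predicted graph properties.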
Related papers
- A Closer Look at Benchmarking Self-Supervised Pre-training with Image Classification [51.35500308126506]
Self-supervised learning (SSL) is a machine learning approach where the data itself provides supervision, eliminating the need for external labels.
We study how classification-based evaluation protocols for SSL correlate and how well they predict downstream performance on different dataset types.
arXiv Detail & Related papers (2024-07-16T23:17:36Z)
- Erasing the Bias: Fine-Tuning Foundation Models for Semi-Supervised Learning [4.137391543972184]
Semi-supervised learning (SSL) has witnessed remarkable progress, resulting in numerous method variations.
In this paper, we present a novel SSL approach named FineSSL that addresses a key limitation of these methods by adapting pre-trained foundation models.
We demonstrate that FineSSL sets a new state of the art for SSL on multiple benchmark datasets, reduces training cost more than sixfold, and seamlessly integrates with various fine-tuning and modern SSL algorithms.
arXiv Detail & Related papers (2024-05-20T03:33:12Z)
- Deep Low-Density Separation for Semi-Supervised Classification [0.0]
We introduce a novel hybrid method that applies low-density separation to the embedded features.
Our approach effectively classifies thousands of unlabeled users from a relatively small number of hand-classified examples.
arXiv Detail & Related papers (2022-05-22T11:00:55Z)
- DATA: Domain-Aware and Task-Aware Pre-training [94.62676913928831]
We present DATA, a simple yet effective neural architecture search (NAS) approach specialized for self-supervised learning (SSL).
Our method achieves promising results across a wide range of computation costs on downstream tasks, including image classification, object detection, and semantic segmentation.
arXiv Detail & Related papers (2022-03-17T02:38:49Z)
- Graph-based Semi-supervised Learning: A Comprehensive Review [51.26862262550445]
Semi-supervised learning (SSL) has tremendous practical value because it can utilize both labeled and unlabeled data.
An important class of SSL methods represents data naturally as graphs, giving rise to graph-based semi-supervised learning (GSSL).
GSSL methods have demonstrated their advantages in various domains due to the uniqueness of their structure, the universality of their applications, and their scalability to large-scale data.
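As a concrete instance of the GSSL family, below is a minimal sketch of classic label propagation (in the style of Zhu & Ghahramani, 2002), the simplest representative baseline such a review covers; variable names are illustrative.
```python
# Illustrative label-propagation sketch; the review covers far richer models.
import numpy as np

def label_propagation(adj, labels, mask, iters=100):
    """adj: (n,n) edge weights; labels: (n,c) one-hot rows (zeros if unknown);
    mask: boolean (n,) marking the labeled nodes."""
    d_inv = 1.0 / np.maximum(adj.sum(axis=1, keepdims=True), 1e-12)
    p = d_inv * adj                      # row-stochastic transition matrix
    f = labels.astype(float).copy()
    for _ in range(iters):
        f = p @ f                        # diffuse labels along edges
        f[mask] = labels[mask]           # clamp the labeled nodes
    return f.argmax(axis=1)

# Toy chain graph 0-1-2-3-4: node 0 labeled class 0, node 4 labeled class 1.
n, c = 5, 2
adj = np.zeros((n, n))
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
labels = np.zeros((n, c)); labels[0, 0] = 1.0; labels[4, 1] = 1.0
mask = np.array([True, False, False, False, True])
print(label_propagation(adj, labels, mask))  # -> [0 0 0 1 1] (tie at node 2 breaks to class 0)
```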
arXiv Detail & Related papers (2021-02-26T05:11:09Z)
- On Data-Augmentation and Consistency-Based Semi-Supervised Learning [77.57285768500225]
Recently proposed consistency-based Semi-Supervised Learning (SSL) methods have advanced the state of the art in several SSL tasks.
Despite these advances, the understanding of these methods is still relatively limited.
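For readers unfamiliar with the consistency idea, a minimal Pi-model-style sketch follows: predictions on two stochastic perturbations of the same unlabeled input are pushed to agree. The model and Gaussian noise here are illustrative placeholders, not a specific paper's setup.
```python
# Illustrative consistency-regularization sketch (Pi-model style).
import torch
import torch.nn.functional as F

model = torch.nn.Sequential(torch.nn.Linear(10, 32), torch.nn.ReLU(),
                            torch.nn.Linear(32, 3))

def consistency_loss(model, x_unlabeled, noise_std=0.1):
    # Two independently perturbed views of the same unlabeled batch.
    v1 = x_unlabeled + noise_std * torch.randn_like(x_unlabeled)
    v2 = x_unlabeled + noise_std * torch.randn_like(x_unlabeled)
    p1 = F.softmax(model(v1), dim=1)
    p2 = F.softmax(model(v2), dim=1)
    return F.mse_loss(p1, p2)            # penalize prediction disagreement

x_u = torch.randn(16, 10)                # unlabeled batch
loss = consistency_loss(model, x_u)      # added to the supervised loss in practice
```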
arXiv Detail & Related papers (2021-01-18T10:12:31Z)
- Matching Distributions via Optimal Transport for Semi-Supervised Learning [31.533832244923843]
Semi-Supervised Learning (SSL) approaches have been an influential framework for leveraging unlabeled data.
We propose a new approach that adopts an Optimal Transport (OT) technique as a metric of similarity between discrete empirical probability measures.
We evaluate our method against state-of-the-art SSL algorithms on standard datasets to demonstrate its superiority and effectiveness.
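The paper's exact algorithm is not reproduced here; the sketch below only illustrates the primitive it builds on: entropy-regularized optimal transport, computed with Sinkhorn iterations, as a similarity between two discrete empirical measures. All names and parameter choices are assumptions for illustration.
```python
# Illustrative Sinkhorn sketch of OT between empirical measures.
import numpy as np

def sinkhorn_distance(a, b, cost, eps=0.1, iters=200):
    """a, b: probability vectors; cost: (len(a), len(b)) pairwise ground costs."""
    k = np.exp(-cost / eps)                 # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(iters):
        v = b / (k.T @ u)                   # alternating Sinkhorn scalings
        u = a / (k @ v)
    plan = u[:, None] * k * v[None, :]      # entropic transport plan
    return float((plan * cost).sum())

# Toy usage: two point clouds viewed as uniform empirical measures.
rng = np.random.default_rng(0)
x, y = rng.normal(size=(5, 2)), rng.normal(size=(7, 2)) + 1.0
cost = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)  # squared Euclidean
cost = cost / cost.max()                    # normalize for numerical stability
a, b = np.full(5, 1 / 5), np.full(7, 1 / 7)
print(sinkhorn_distance(a, b, cost))        # smaller = more similar measures
```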
arXiv Detail & Related papers (2020-12-04T11:15:14Z)
- SemiNLL: A Framework of Noisy-Label Learning by Semi-Supervised Learning [58.26384597768118]
SemiNLL is a versatile framework that combines sample selection (SS) strategies and SSL models in an end-to-end manner.
Our framework can absorb various SS strategies and SSL backbones, leveraging their strengths to achieve promising performance.
arXiv Detail & Related papers (2020-12-02T01:49:47Z)