Contrastive Self-supervised Learning for Graph Classification
- URL: http://arxiv.org/abs/2009.05923v1
- Date: Sun, 13 Sep 2020 05:12:55 GMT
- Title: Contrastive Self-supervised Learning for Graph Classification
- Authors: Jiaqi Zeng, Pengtao Xie
- Abstract summary: We propose two approaches based on contrastive self-supervised learning (CSSL) to alleviate overfitting.
In the first approach, we use CSSL to pretrain graph encoders on widely-available unlabeled graphs without relying on human-provided labels.
In the second approach, we develop a regularizer based on CSSL, and solve the supervised classification task and the unsupervised CSSL task simultaneously.
- Score: 21.207647143672585
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph classification is a widely studied problem and has broad applications.
In many real-world problems, the number of labeled graphs available for
training classification models is limited, which renders these models prone to
overfitting. To address this problem, we propose two approaches based on
contrastive self-supervised learning (CSSL) to alleviate overfitting. In the
first approach, we use CSSL to pretrain graph encoders on widely-available
unlabeled graphs without relying on human-provided labels, then finetune the
pretrained encoders on labeled graphs. In the second approach, we develop a
regularizer based on CSSL, and solve the supervised classification task and the
unsupervised CSSL task simultaneously. To perform CSSL on graphs, given a
collection of original graphs, we perform data augmentation to create augmented
graphs out of the original graphs. An augmented graph is created by
consecutively applying a sequence of graph alteration operations. A contrastive
loss is defined to learn graph encoders by judging whether two augmented graphs
are from the same original graph. Experiments on various graph classification
datasets demonstrate the effectiveness of our proposed methods.
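To make the procedure concrete, the snippet below is a minimal, self-contained sketch of the CSSL pretraining variant. The toy encoder, the edge-dropping augmentation, and the SimCLR-style NT-Xent loss are illustrative assumptions; the paper's actual graph alteration operations, encoder architecture, and loss details are not reproduced here.

```python
# A minimal sketch of the CSSL pretraining idea, NOT the authors' exact code.
# Assumptions: graphs are dense adjacency matrices with node features, the
# encoder is a toy one-layer message-passing network, the only alteration
# operation shown is random edge dropping, and the loss is a SimCLR-style
# NT-Xent contrastive loss over pairs of augmented views.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyGraphEncoder(nn.Module):
    """One propagation step over a normalized adjacency, then mean pooling."""

    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)

    def forward(self, adj, x):
        a = adj + torch.eye(adj.size(0))            # add self-loops
        a = a / a.sum(dim=1, keepdim=True)          # row-normalize
        h = torch.relu(self.lin(a @ x))
        return h.mean(dim=0)                        # graph-level embedding


def drop_edges(adj, p=0.2):
    """One example of a graph alteration operation: randomly remove edges."""
    keep = (torch.rand_like(adj) > p).float()
    keep = torch.triu(keep, diagonal=1)
    keep = keep + keep.t()                          # keep the graph undirected
    return adj * keep


def contrastive_loss(z, temperature=0.5):
    """Judge whether two augmented graphs come from the same original graph.
    z has shape (2N, d); rows i and i + N are views of the same original."""
    z = F.normalize(z, dim=1)
    n = z.size(0) // 2
    sim = z @ z.t() / temperature
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float("-inf"))
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)


if __name__ == "__main__":
    torch.manual_seed(0)
    # Toy stand-ins for widely available unlabeled graphs.
    graphs = [(torch.bernoulli(torch.full((8, 8), 0.3)), torch.randn(8, 5))
              for _ in range(16)]
    encoder = ToyGraphEncoder(in_dim=5, hid_dim=32)
    opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

    for step in range(10):                          # CSSL pretraining loop
        view1 = [encoder(drop_edges(adj), x) for adj, x in graphs]
        view2 = [encoder(drop_edges(adj), x) for adj, x in graphs]
        loss = contrastive_loss(torch.stack(view1 + view2))
        opt.zero_grad()
        loss.backward()
        opt.step()
    # The pretrained encoder would then be finetuned on labeled graphs; in the
    # second approach, this same loss would instead be added as a regularizer
    # to the supervised classification objective.
```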
Related papers
- A Topology-aware Graph Coarsening Framework for Continual Graph Learning [8.136809136959302]
Continual learning on graphs tackles the problem of training a graph neural network (GNN) where graph data arrive in a streaming fashion.
Traditional continual learning strategies such as Experience Replay can be adapted to streaming graphs.
We propose TA$\mathbb{CO}$, a (t)opology-(a)ware graph (co)arsening and (co)ntinual learning framework.
arXiv Detail & Related papers (2024-01-05T22:22:13Z) - Let There Be Order: Rethinking Ordering in Autoregressive Graph Generation [6.422073551199993]
Conditional graph generation tasks involve training a model to generate a graph given a set of input conditions.
Many previous studies employ autoregressive models to incrementally generate graph components such as nodes and edges.
As graphs typically lack a natural ordering among their components, converting a graph into a sequence of tokens is not straightforward.
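As an illustration of this ordering ambiguity (a hedged sketch, not code from the paper), the snippet below serializes a small graph into a token sequence under a BFS ordering from a chosen root; different roots can yield different sequences for the same graph. The `graph_to_sequence` tokenization is a hypothetical scheme.

```python
# Illustrative sketch (not taken from the paper): serializing a graph into a
# token sequence requires choosing a node ordering. Here a BFS order from an
# arbitrary root is used; different roots can yield different sequences for
# the same graph. The tokenization scheme below is a hypothetical one.
import networkx as nx


def graph_to_sequence(g: nx.Graph, root):
    """Emit nodes in BFS order; for each node, list edges back to earlier nodes."""
    order = [root] + [v for _, v in nx.bfs_edges(g, root)]
    position = {v: i for i, v in enumerate(order)}
    tokens = []
    for i, v in enumerate(order):
        tokens.append(("node", i))
        for j in sorted(position[u] for u in g.neighbors(v) if position[u] < i):
            tokens.append(("edge", j, i))
    return tokens


if __name__ == "__main__":
    g = nx.Graph([(0, 1), (1, 2), (1, 3), (3, 4)])
    print(graph_to_sequence(g, root=0))  # one sequence for this graph ...
    print(graph_to_sequence(g, root=4))  # ... a different one under another ordering
```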
arXiv Detail & Related papers (2023-05-24T20:52:34Z) - Semi-Supervised Hierarchical Graph Classification [54.25165160435073]
We study the node classification problem in a hierarchical graph, where a 'node' is itself a graph instance.
We propose Hierarchical Graph Mutual Information (HGMI) and present a way to compute HGMI with a theoretical guarantee.
We demonstrate the effectiveness of this hierarchical graph modeling and the proposed SEAL-CI method on text and social network data.
arXiv Detail & Related papers (2022-06-11T04:05:29Z) - Graph Self-supervised Learning with Accurate Discrepancy Learning [64.69095775258164]
We propose a framework that aims to learn the exact discrepancy between the original and the perturbed graphs, coined as Discrepancy-based Self-supervised LeArning (D-SLA).
We validate our method on various graph-related downstream tasks, including molecular property prediction, protein function prediction, and link prediction tasks, on which our model largely outperforms relevant baselines.
arXiv Detail & Related papers (2022-02-07T08:04:59Z) - Bringing Your Own View: Graph Contrastive Learning without Prefabricated Data Augmentations [94.41860307845812]
Self-supervision has recently been surging at its new frontier, graph learning.
GraphCL uses a prefabricated prior reflected by the ad hoc manual selection of graph data augmentations.
We have extended the prefabricated discrete prior in the augmentation set, to a learnable continuous prior in the parameter space of graph generators.
We have leveraged both principles of information minimization (InfoMin) and information bottleneck (InfoBN) to regularize the learned priors.
arXiv Detail & Related papers (2022-01-04T15:49:18Z) - Graph Coarsening with Neural Networks [8.407217618651536]
We propose a framework for measuring the quality of a coarsening algorithm and show that, depending on the goal, the Laplace operator on the coarse graph needs to be chosen carefully.
Motivated by the observation that the current choice of edge weight for the coarse graph may be sub-optimal, we parametrize the weight assignment map with graph neural networks and train it to improve the coarsening quality in an unsupervised way.
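As background for the role of the Laplace operator (a sketch under simplifying assumptions, not the paper's method), the snippet below builds a coarse graph from a hard one-hot cluster assignment and forms its combinatorial Laplacian; the paper instead learns the coarse edge weights with a graph neural network.

```python
# Background sketch (not the paper's learned weighting): build a coarse graph
# from a hard one-hot node-to-cluster assignment and form its combinatorial
# Laplacian. The paper instead parametrizes the coarse edge weights with a GNN;
# here they simply come from summing the original weights between clusters.
import numpy as np


def coarsen(adj: np.ndarray, assignment: np.ndarray):
    """adj: (n, n) weighted adjacency; assignment: (n, k) one-hot cluster matrix."""
    coarse_adj = assignment.T @ adj @ assignment
    np.fill_diagonal(coarse_adj, 0.0)        # drop self-loops from intra-cluster edges
    degree = np.diag(coarse_adj.sum(axis=1))
    laplacian = degree - coarse_adj          # one possible choice of Laplace operator
    return coarse_adj, laplacian


if __name__ == "__main__":
    adj = np.array([[0, 1, 1, 0],
                    [1, 0, 1, 1],
                    [1, 1, 0, 0],
                    [0, 1, 0, 0]], dtype=float)
    # Merge nodes {0, 1, 2} into cluster 0 and node {3} into cluster 1.
    P = np.array([[1, 0], [1, 0], [1, 0], [0, 1]], dtype=float)
    c_adj, c_lap = coarsen(adj, P)
    print(c_adj)  # [[0. 1.] [1. 0.]]
    print(c_lap)  # [[ 1. -1.] [-1.  1.]]
```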
arXiv Detail & Related papers (2021-02-02T06:50:07Z) - Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In the prevailing subgraph-based formalism, a link prediction problem is converted into a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction in the original graph can be equivalently solved as node classification in the corresponding line graph, instead of as graph classification.
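A minimal illustration of this correspondence using networkx (the authors' GNN model over such graphs is not shown): every node of `nx.line_graph(g)` is an edge of `g`, so scoring a candidate link reduces to classifying the corresponding line-graph node.

```python
# Minimal illustration of the line-graph view with networkx (the authors' GNN
# model is not shown): every node of nx.line_graph(g) is an edge of g, so
# scoring a candidate link amounts to classifying a node of the line graph.
import networkx as nx

g = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0)])
candidate = (0, 2)                      # a potential link we would like to score
g_plus = g.copy()
g_plus.add_edge(*candidate)             # insert it so it appears as a line-graph node

lg = nx.line_graph(g_plus)
node = candidate if candidate in lg else tuple(reversed(candidate))
print(node in lg)                       # True: the candidate link is a line-graph node
print(list(lg.neighbors(node)))         # its neighbors are edges of g sharing an endpoint
```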
arXiv Detail & Related papers (2020-10-20T05:54:31Z) - Certified Robustness of Graph Classification against Topology Attack with Randomized Smoothing [22.16111584447466]
Graph-based machine learning models are vulnerable to adversarial perturbations due to the non-i.i.d. nature of graph data.
We build a smoothed graph classification model with certified robustness guarantee.
We also evaluate the effectiveness of our approach on a graph convolutional network (GCN) based multi-class graph classification model.
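A toy sketch of the smoothing idea, assuming perturbations are independent symmetric edge flips; the certification bound and the trained GCN from the paper are not reproduced, and `base_classifier` is a hypothetical stand-in.

```python
# Toy sketch of randomized smoothing over graph topology, not the paper's
# certification procedure: the smoothed classifier returns the majority vote of
# a base classifier over randomly perturbed copies of the adjacency matrix.
# `base_classifier` is a hypothetical stand-in for, e.g., a trained GCN.
import numpy as np


def smoothed_predict(base_classifier, adj, num_samples=100, flip_prob=0.05, seed=0):
    rng = np.random.default_rng(seed)
    votes = {}
    for _ in range(num_samples):
        flips = rng.random(adj.shape) < flip_prob
        flips = np.triu(flips, k=1)
        flips = flips | flips.T                     # keep the perturbation symmetric
        perturbed = np.where(flips, 1 - adj, adj)   # flip the selected edges
        label = base_classifier(perturbed)
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)                # majority vote


if __name__ == "__main__":
    adj = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
    # Stand-in classifier: label 1 ("dense") if at least half the possible edges exist.
    dense_or_sparse = lambda a: int(a.sum() / 2 >= a.shape[0] * (a.shape[0] - 1) / 4)
    print(smoothed_predict(dense_or_sparse, adj))
```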
arXiv Detail & Related papers (2020-09-12T22:18:54Z) - Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z) - Unsupervised Graph Embedding via Adaptive Graph Learning [85.28555417981063]
Graph autoencoders (GAEs) are powerful tools in representation learning for graph embedding.
In this paper, two novel unsupervised graph embedding methods are proposed: unsupervised graph embedding via adaptive graph learning (BAGE) and unsupervised graph embedding via variational adaptive graph learning (VBAGE).
Experimental studies on several datasets validate our design and demonstrate that our methods outperform baselines by a wide margin in node clustering, node classification, and graph visualization tasks.
arXiv Detail & Related papers (2020-03-10T02:33:14Z)
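For background on the graph autoencoder family that the last entry builds on (a sketch under simplifying assumptions; the adaptive graph learning in BAGE/VBAGE is not shown), a plain GAE encodes nodes and reconstructs the adjacency matrix with an inner-product decoder:

```python
# Background sketch of a plain graph autoencoder (GAE), the family the entry
# above builds on; the adaptive graph learning of BAGE/VBAGE is not shown.
# A toy one-layer encoder produces node embeddings and an inner-product
# decoder reconstructs the adjacency matrix.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ToyGAE(nn.Module):
    def __init__(self, in_dim, emb_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, emb_dim)

    def encode(self, adj, x):
        a = adj + torch.eye(adj.size(0))            # self-loops
        a = a / a.sum(dim=1, keepdim=True)          # row-normalize
        return a @ self.lin(x)                      # node embeddings

    def forward(self, adj, x):
        z = self.encode(adj, x)
        return torch.sigmoid(z @ z.t())             # reconstructed adjacency


if __name__ == "__main__":
    torch.manual_seed(0)
    adj = (torch.rand(10, 10) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float()             # make the toy graph undirected
    x = torch.randn(10, 4)
    model = ToyGAE(in_dim=4, emb_dim=8)
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(50):
        loss = F.binary_cross_entropy(model(adj, x), adj)
        opt.zero_grad()
        loss.backward()
        opt.step()
    embeddings = model.encode(adj, x)               # usable for clustering or visualization
```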
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.