Adaptive-Step Graph Meta-Learner for Few-Shot Graph Classification
- URL: http://arxiv.org/abs/2003.08246v2
- Date: Tue, 23 Jun 2020 06:22:36 GMT
- Title: Adaptive-Step Graph Meta-Learner for Few-Shot Graph Classification
- Authors: Ning Ma, Jiajun Bu, Jieyu Yang, Zhen Zhang, Chengwei Yao, Zhi Yu,
Sheng Zhou and Xifeng Yan
- Abstract summary: We propose a novel framework consisting of a graph meta-learner, which uses GNN-based modules for fast adaptation on graph data.
Our framework achieves state-of-the-art results on several few-shot graph classification tasks compared to baselines.
- Score: 25.883839335786025
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph classification aims to extract accurate information from
graph-structured data for classification and is becoming increasingly
important in the graph learning community. Although Graph Neural Networks (GNNs)
have been successfully applied to graph classification tasks, most of them
overlook the scarcity of labeled graph data in many applications. For example,
in bioinformatics, obtaining protein graph labels usually needs laborious
experiments. Recently, few-shot learning has been explored to alleviate this
problem, given only a few labeled graph samples of the test classes. The shared
sub-structures between training classes and test classes are essential in
few-shot graph classification. Existing methods assume that the test classes
belong to the same set of super-classes clustered from the training classes.
However, according to our observations, the label spaces of training classes
and test classes usually do not overlap in real-world scenarios. As a result,
existing methods fail to capture the local structures of unseen test classes
well. To overcome this limitation, in this paper we propose a direct method
to capture the sub-structures with a well-initialized meta-learner within a
few adaptation steps. More specifically, (1) we propose a novel framework
consisting of a graph meta-learner, which uses GNN-based modules for fast
adaptation on graph data, and a step controller for the robustness and
generalization of the meta-learner; (2) we provide a quantitative analysis of
the framework and give a graph-dependent upper bound on its generalization
error; (3) extensive experiments on real-world datasets demonstrate that our
framework achieves state-of-the-art results on several few-shot graph
classification tasks compared to baselines.
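The adaptive-step idea above can be illustrated with a minimal sketch. This is a hypothetical simplification, not the paper's method: the paper uses GNN-based modules and a learned step controller, whereas here the learner is a toy one-parameter linear model and the "controller" is simply a loss-improvement threshold that decides when to stop the inner-loop adaptation. All names (`adapt`, `tol`, the toy task) are illustrative.

```python
# Minimal sketch of adaptive-step inner-loop adaptation (hypothetical
# simplification of the paper's step controller: here, adaptation halts
# once the loss improvement falls below a threshold).

def loss(w, task):
    # mean squared error of the model y = w * x on the task's support set
    return sum((w * x - y) ** 2 for x, y in task) / len(task)

def grad(w, task):
    # derivative of the mean squared error above with respect to w
    return sum(2 * (w * x - y) * x for x, y in task) / len(task)

def adapt(w_init, task, lr=0.05, max_steps=50, tol=1e-4):
    """Inner-loop adaptation with an adaptive number of steps.

    Instead of a fixed step count, stop as soon as the loss improvement
    drops below `tol` (a crude stand-in for a learned step controller).
    """
    w, prev = w_init, loss(w_init, task)
    steps = 0
    for _ in range(max_steps):
        w = w - lr * grad(w, task)
        cur = loss(w, task)
        steps += 1
        if prev - cur < tol:  # controller decides adaptation is done
            break
        prev = cur
    return w, steps

# Usage: a few-shot "task" whose true slope is 2.0, starting from the
# (hypothetical) meta-initialization w = 0.
task = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w_adapted, n_steps = adapt(0.0, task)
```

The point of the sketch is only the control flow: the step count is data-dependent, so easy tasks adapt in few steps while harder ones take more, which is what the step controller in the framework regulates.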
Related papers
- Stochastic Subgraph Neighborhood Pooling for Subgraph Classification [2.1270496914042996]
Subgraph Neighborhood Pooling (SSNP) jointly aggregates the subgraph and its neighborhood information without any computationally expensive operations such as labeling tricks.
Our experiments demonstrate that our models outperform current state-of-the-art methods (with a margin of up to 2%) while being up to 3X faster in training.
arXiv Detail & Related papers (2023-04-17T18:49:18Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- A Simple Yet Effective Pretraining Strategy for Graph Few-shot Learning [38.66690010054665]
We propose a simple transductive fine-tuning based framework as a new paradigm for graph few-shot learning.
For pretraining, we propose a supervised contrastive learning framework with data augmentation strategies specific for few-shot node classification.
arXiv Detail & Related papers (2022-03-29T22:30:00Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structured data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Label-informed Graph Structure Learning for Node Classification [16.695269600529056]
We propose a novel label-informed graph structure learning framework which incorporates label information explicitly through a class transition matrix.
We conduct extensive experiments on seven node classification benchmark datasets and the results show that our method outperforms or matches the state-of-the-art baselines.
arXiv Detail & Related papers (2021-08-10T11:14:09Z)
- Weakly-supervised Graph Meta-learning for Few-shot Node Classification [53.36828125138149]
We propose a new graph meta-learning framework -- Graph Hallucination Networks (Meta-GHN)
Based on a new robustness-enhanced episodic training, Meta-GHN is meta-learned to hallucinate clean node representations from weakly-labeled data.
Extensive experiments demonstrate the superiority of Meta-GHN over existing graph meta-learning studies.
arXiv Detail & Related papers (2021-06-12T22:22:10Z)
- Structure-Enhanced Meta-Learning For Few-Shot Graph Classification [53.54066611743269]
This work explores the potential of metric-based meta-learning for solving few-shot graph classification.
An implementation upon GIN, named SMFGIN, is tested on two datasets, Chembl and TRIANGLES.
arXiv Detail & Related papers (2021-03-05T09:03:03Z)
- Model-Agnostic Graph Regularization for Few-Shot Learning [60.64531995451357]
We present a comprehensive study on graph embedded few-shot learning.
We introduce a graph regularization approach that allows a deeper understanding of the impact of incorporating graph information between labels.
Our approach improves the performance of strong base learners by up to 2% on Mini-ImageNet and 6.7% on ImageNet-FS.
arXiv Detail & Related papers (2021-02-14T05:28:13Z)
- Structured Graph Learning for Clustering and Semi-supervised Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and adaptive neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
- Few-Shot Learning on Graphs via Super-Classes based on Graph Spectral Measures [14.932318540666545]
We study the problem of few-shot graph classification in graph neural networks (GNNs) to recognize unseen classes, given limited labeled graph examples.
We propose an approach in which a probability measure is assigned to each graph based on the spectrum of the graph's normalized Laplacian.
arXiv Detail & Related papers (2020-02-27T17:11:14Z)
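The spectral idea in the entry above can be sketched on a toy graph. This is a hedged, pure-Python illustration, not the cited paper's exact construction: it builds the normalized Laplacian L = I - D^(-1/2) A D^(-1/2) of a 3-node triangle, extracts its eigenvalues with classical Jacobi rotations, and (as one illustrative choice) normalizes the eigenvalues into a probability measure over the spectrum.

```python
import math

def normalized_laplacian(adj):
    # L = I - D^(-1/2) A D^(-1/2) for an unweighted adjacency matrix
    deg = [sum(row) for row in adj]
    n = len(adj)
    return [[(1.0 if i == j else 0.0) - adj[i][j] / math.sqrt(deg[i] * deg[j])
             for j in range(n)] for i in range(n)]

def eigenvalues_sym(M, max_rot=100, tol=1e-12):
    """Eigenvalues of a small symmetric matrix via classical Jacobi rotations."""
    A = [row[:] for row in M]
    n = len(A)
    for _ in range(max_rot):
        # locate the largest off-diagonal entry
        p, q, big = 0, 1, 0.0
        for i in range(n):
            for j in range(i + 1, n):
                if abs(A[i][j]) > big:
                    big, p, q = abs(A[i][j]), i, j
        if big < tol:
            break
        # rotation angle that zeroes A[p][q] after A <- J^T A J
        theta = 0.5 * math.atan2(2.0 * A[p][q], A[q][q] - A[p][p])
        c, s = math.cos(theta), math.sin(theta)
        for k in range(n):  # row update: A <- J^T A
            Apk, Aqk = A[p][k], A[q][k]
            A[p][k], A[q][k] = c * Apk - s * Aqk, s * Apk + c * Aqk
        for k in range(n):  # column update: A <- A J
            Akp, Akq = A[k][p], A[k][q]
            A[k][p], A[k][q] = c * Akp - s * Akq, s * Akp + c * Akq
    return sorted(A[i][i] for i in range(n))

# Triangle graph: every node has degree 2; the spectrum of L is {0, 1.5, 1.5}.
triangle = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
evals = eigenvalues_sym(normalized_laplacian(triangle))
total = sum(evals)
spectral_measure = [v / total for v in evals]  # approximately [0.0, 0.5, 0.5]
```

Because the normalized Laplacian spectrum always lies in [0, 2] regardless of graph size, such per-graph measures are comparable across graphs, which is what makes spectrum-based grouping of graphs into super-classes plausible.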
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.