Structure-Enhanced Meta-Learning For Few-Shot Graph Classification
- URL: http://arxiv.org/abs/2103.03547v1
- Date: Fri, 5 Mar 2021 09:03:03 GMT
- Title: Structure-Enhanced Meta-Learning For Few-Shot Graph Classification
- Authors: Shunyu Jiang, Fuli Feng, Weijian Chen, Xiang Li, Xiangnan He
- Abstract summary: This work explores the potential of metric-based meta-learning for solving few-shot graph classification.
An implementation upon GIN, named SMF-GIN, is tested on two datasets, Chembl and TRIANGLES.
- Score: 53.54066611743269
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph classification is a highly impactful task that plays a crucial role in
a myriad of real-world applications such as molecular property prediction and
protein function prediction. Aiming to handle new classes with limited
labeled graphs, few-shot graph classification has become a bridge between
existing graph classification solutions and practical usage. This work explores the
potential of metric-based meta-learning for solving few-shot graph
classification. We highlight the importance of considering structural
characteristics in the solution and propose a novel framework which explicitly
considers global structure and local structure of the input graph. An
implementation upon GIN, named SMF-GIN, is tested on two datasets, Chembl and
TRIANGLES, where extensive experiments validate the effectiveness of the
proposed method. The Chembl dataset is constructed to fill the gap of a missing
large-scale benchmark for few-shot graph classification evaluation; it will be
released together with the implementation of SMF-GIN upon acceptance.
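The following is a minimal, hypothetical sketch of the general recipe the abstract describes: a metric-based (prototypical-network-style) few-shot episode on top of a GIN-style encoder whose layer-wise readouts stand in for combining local and global structure. The class names, dense-adjacency message passing, and distance-based scoring are illustrative assumptions, not the released SMF-GIN code.
```python
# Hypothetical sketch, not the authors' implementation: a prototypical,
# metric-based few-shot graph classifier over a minimal GIN-style encoder.
import torch
import torch.nn as nn


class GINLayer(nn.Module):
    """One GIN update on a dense adjacency matrix: h' = MLP((1 + eps) * h + A @ h)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.eps = nn.Parameter(torch.zeros(1))
        self.mlp = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU(),
                                 nn.Linear(out_dim, out_dim))

    def forward(self, adj, h):
        return self.mlp((1 + self.eps) * h + adj @ h)


class GraphEncoder(nn.Module):
    """Stacked GIN layers; the mean-pooled readout of every layer is concatenated,
    so the graph embedding mixes shallow (more local) and deep (more global)
    structure; this is only a rough stand-in for the paper's structural modules."""
    def __init__(self, in_dim, hid_dim, num_layers=3):
        super().__init__()
        dims = [in_dim] + [hid_dim] * num_layers
        self.layers = nn.ModuleList(
            [GINLayer(i, o) for i, o in zip(dims[:-1], dims[1:])])

    def forward(self, adj, h):
        readouts = []
        for layer in self.layers:
            h = layer(adj, h)
            readouts.append(h.mean(dim=0))   # graph-level mean readout per layer
        return torch.cat(readouts)           # shape: [num_layers * hid_dim]


def prototypical_logits(support_emb, support_labels, query_emb, n_way):
    """Metric-based scoring: class prototypes are mean support embeddings, and
    queries are scored by negative squared Euclidean distance to each prototype."""
    protos = torch.stack([support_emb[support_labels == c].mean(dim=0)
                          for c in range(n_way)])
    return -torch.cdist(query_emb, protos) ** 2
```
In a training episode, support and query graphs would each be encoded with GraphEncoder, and cross-entropy over these logits against the query labels would drive meta-training.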
Related papers
- Self-supervised Learning and Graph Classification under Heterophily [4.358149865548289]
We propose a novel self-supervised strategy for Pre-training Graph neural networks (GNNs) based on the Metric (PGM).
Our strategy achieves state-of-the-art performance for molecular property prediction and protein function prediction.
arXiv Detail & Related papers (2023-06-14T12:32:38Z)
- Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
arXiv Detail & Related papers (2023-05-31T11:04:53Z)
- Semantic Graph Neural Network with Multi-measure Learning for Semi-supervised Classification [5.000404730573809]
Graph Neural Networks (GNNs) have attracted increasing attention in recent years.
Recent studies have shown that GNNs are vulnerable to the complex underlying structure of the graph.
We propose a novel framework for semi-supervised classification.
arXiv Detail & Related papers (2022-12-04T06:17:11Z)
- Weakly-supervised Graph Meta-learning for Few-shot Node Classification [53.36828125138149]
We propose a new graph meta-learning framework, Graph Hallucination Networks (Meta-GHN).
Based on a new robustness-enhanced episodic training, Meta-GHN is meta-learned to hallucinate clean node representations from weakly-labeled data.
Extensive experiments demonstrate the superiority of Meta-GHN over existing graph meta-learning studies.
arXiv Detail & Related papers (2021-06-12T22:22:10Z)
- Self-supervised Graph-level Representation Learning with Local and Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
arXiv Detail & Related papers (2021-06-08T05:25:38Z)
- Model-Agnostic Graph Regularization for Few-Shot Learning [60.64531995451357]
We present a comprehensive study on graph embedded few-shot learning.
We introduce a graph regularization approach that allows a deeper understanding of the impact of incorporating graph information between labels.
Our approach improves the performance of strong base learners by up to 2% on Mini-ImageNet and 6.7% on ImageNet-FS.
arXiv Detail & Related papers (2021-02-14T05:28:13Z)
- Structured Graph Learning for Clustering and Semi-supervised Classification [74.35376212789132]
We propose a graph learning framework to preserve both the local and global structure of data.
Our method uses the self-expressiveness of samples to capture the global structure and an adaptive neighbor approach to respect the local structure.
Our model is equivalent to a combination of kernel k-means and k-means methods under certain conditions.
arXiv Detail & Related papers (2020-08-31T08:41:20Z)
- Adaptive-Step Graph Meta-Learner for Few-Shot Graph Classification [25.883839335786025]
We propose a novel framework consisting of a graph meta-learner, which uses GNNs based modules for fast adaptation on graph data.
Our framework gets state-of-the-art results on several few-shot graph classification tasks compared to baselines.
arXiv Detail & Related papers (2020-03-18T14:38:48Z)
- Few-Shot Learning on Graphs via Super-Classes based on Graph Spectral Measures [14.932318540666545]
We study the problem of few-shot graph classification in graph neural networks (GNNs) to recognize unseen classes, given limited labeled graph examples.
We propose an approach where a probability measure is assigned to each graph based on the spectrum of the graph's normalized Laplacian (a minimal illustrative sketch of this idea follows the list).
arXiv Detail & Related papers (2020-02-27T17:11:14Z)
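As a rough illustration of the spectral idea in the last entry, the sketch below assigns each graph a probability vector derived from the eigenvalues of its normalized Laplacian; the histogram binning and helper names are assumptions made for illustration rather than that paper's exact construction.
```python
# Hypothetical sketch: a per-graph probability measure built from the
# normalized Laplacian spectrum. The binning choice is illustrative only.
import numpy as np


def normalized_laplacian_spectrum(adj):
    """Eigenvalues of L = I - D^{-1/2} A D^{-1/2}; they lie in [0, 2]."""
    adj = np.asarray(adj, dtype=float)
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.zeros_like(deg)
    d_inv_sqrt[deg > 0] = deg[deg > 0] ** -0.5
    lap = np.eye(adj.shape[0]) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    return np.linalg.eigvalsh(lap)


def spectral_measure(adj, bins=10):
    """Normalized histogram of the spectrum over [0, 2] as a probability vector."""
    hist, _ = np.histogram(normalized_laplacian_spectrum(adj),
                           bins=bins, range=(0.0, 2.0))
    return hist / max(hist.sum(), 1)
```
Distances between such per-graph measures could then be used to group graphs into super-classes, as that entry's title suggests.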