Meta Propagation Networks for Graph Few-shot Semi-supervised Learning
- URL: http://arxiv.org/abs/2112.09810v1
- Date: Sat, 18 Dec 2021 00:11:56 GMT
- Title: Meta Propagation Networks for Graph Few-shot Semi-supervised Learning
- Authors: Kaize Ding, Jianling Wang, James Caverlee and Huan Liu
- Abstract summary: We propose a decoupled network architecture equipped with a novel meta-learning algorithm to solve this problem.
In essence, our framework Meta-PN infers high-quality pseudo labels on unlabeled nodes via a meta-learned label propagation strategy.
Our approach offers easy and substantial performance gains compared to existing techniques on various benchmark datasets.
- Score: 39.96930762034581
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Inspired by the extensive success of deep learning, graph neural networks
(GNNs) have been proposed to learn expressive node representations and
demonstrated promising performance in various graph learning tasks. However,
existing endeavors predominantly focus on the conventional semi-supervised
setting, where relatively abundant gold-labeled nodes are provided. This is
often impractical, however, since data labeling is unbearably laborious
and requires intensive domain knowledge, especially when considering the
heterogeneity of graph-structured data. Under the few-shot semi-supervised
setting, the performance of most of the existing GNNs is inevitably undermined
by the overfitting and oversmoothing issues, largely owing to the shortage of
labeled data. In this paper, we propose a decoupled network architecture
equipped with a novel meta-learning algorithm to solve this problem. In
essence, our framework Meta-PN infers high-quality pseudo labels on unlabeled
nodes via a meta-learned label propagation strategy, which effectively augments
the scarce labeled data while enabling large receptive fields during training.
Extensive experiments demonstrate that our approach offers easy and substantial
performance gains compared to existing techniques on various benchmark
datasets.
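As a concrete illustration of the decoupled design, the sketch below spreads the few gold labels over the graph to produce soft pseudo labels; a separate feature-based classifier would then be fit against them. This is a minimal sketch, not the authors' code: Meta-PN meta-learns the per-hop mixing weights, which are fixed by hand here (`alphas`), and all names are hypothetical.

```python
import numpy as np

def propagate_labels(adj, labels, mask, alphas=(0.1, 0.2, 0.3, 0.4)):
    """Spread the few gold labels over the graph into soft pseudo labels.

    adj    : (n, n) symmetric adjacency matrix
    labels : (n, c) one-hot label matrix (zero rows for unlabeled nodes)
    mask   : (n,) bool, True on the few gold-labeled nodes
    alphas : per-hop mixing weights; Meta-PN meta-learns these, here they
             are fixed constants purely for illustration
    """
    deg = adj.sum(1)
    d_inv_sqrt = np.zeros_like(deg, dtype=float)
    nz = deg > 0
    d_inv_sqrt[nz] = deg[nz] ** -0.5
    a_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]  # D^-1/2 A D^-1/2

    h = labels * mask[:, None]       # start from the gold labels only
    pseudo = alphas[0] * h
    for a_k in alphas[1:]:           # each hop enlarges the receptive field
        h = a_norm @ h
        pseudo += a_k * h
    # row-normalize into soft pseudo-label distributions
    return pseudo / np.clip(pseudo.sum(1, keepdims=True), 1e-12, None)
```

A feature-only classifier would then be trained on both the gold and pseudo labels; in the paper, the mixing weights are adapted by a meta objective on the labeled nodes rather than fixed.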
Related papers
- Enhancing Graph Neural Networks with Limited Labeled Data by Actively Distilling Knowledge from Large Language Models [30.867447814409623]
Graph neural networks (GNNs) excel at node classification, a fundamental task on graphs.
We propose a novel approach that integrates Large Language Models (LLMs) and GNNs.
Our model improves node classification accuracy with considerably limited labeled data, surpassing state-of-the-art baselines by significant margins.
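A compact sketch of the query-then-distill idea: pick the unlabeled nodes the current GNN is least certain about and have an LLM annotate them. The entropy criterion and the `llm_label` interface are illustrative assumptions; the paper's actual active-selection strategy is not spelled out in this summary.

```python
import numpy as np

def expand_labels_with_llm(probs, labeled, llm_label, budget=10):
    """Query an LLM for labels on the most uncertain unlabeled nodes.

    probs     : (n, c) current GNN class probabilities
    labeled   : set of already-labeled node indices
    llm_label : callable node_index -> class index; stands in for prompting
                a real LLM with the node's text (hypothetical interface)
    """
    entropy = -(probs * np.log(probs + 1e-12)).sum(1)
    entropy[list(labeled)] = -np.inf            # never re-query labeled nodes
    queried = np.argsort(entropy)[-budget:]     # highest-entropy nodes first
    return {int(i): llm_label(int(i)) for i in queried}
```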
arXiv Detail & Related papers (2024-07-19T02:34:10Z)
- Hyperbolic Graph Neural Networks at Scale: A Meta Learning Approach [19.237565246362134]
We introduce a novel method, Hyperbolic GRAph Meta Learner (H-GRAM), for the tasks of node classification and link prediction.
H-GRAM learns transferable information from a set of support local subgraphs in the form of hyperbolic meta gradients and label hyperbolic protonets.
Our comparative analysis shows that H-GRAM effectively learns and transfers information in multiple challenging few-shot settings.
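For intuition about what a hyperbolic protonet computes, here is a bare-bones Poincaré-ball sketch: queries are assigned to the class whose prototype is nearest in hyperbolic distance. H-GRAM's meta-gradient machinery and subgraph encoding are omitted, and the names are mine.

```python
import numpy as np

def poincare_dist(u, v, eps=1e-9):
    """Geodesic distance in the Poincare ball model of hyperbolic space."""
    sq = np.sum((u - v) ** 2, axis=-1)
    den = (1 - np.sum(u * u, axis=-1)) * (1 - np.sum(v * v, axis=-1))
    return np.arccosh(1 + 2 * sq / np.maximum(den, eps))

def protonet_classify(queries, prototypes):
    """Assign each query embedding to its nearest hyperbolic class prototype.

    queries    : (m, d) embeddings inside the unit ball
    prototypes : list of (d,) class prototypes, also inside the unit ball
    """
    d = np.stack([poincare_dist(queries, p) for p in prototypes], axis=-1)
    return d.argmin(axis=-1)    # (m,) predicted class indices
```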
arXiv Detail & Related papers (2023-10-29T06:11:49Z)
- SMARTQUERY: An Active Learning Framework for Graph Neural Networks through Hybrid Uncertainty Reduction [25.77052028238513]
We propose a framework to learn a graph neural network with very few labeled nodes using a hybrid uncertainty reduction function.
We demonstrate the competitive performance of our method against state-of-the-arts on very few labeled data.
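One plausible reading of a hybrid uncertainty score is sketched below: model entropy blended with how much a node's prediction conflicts with its neighbors'. The exact hybrid function SMARTQUERY uses is not given in this summary, so both terms and the weight `lam` are assumptions.

```python
import numpy as np

def hybrid_uncertainty(probs, adj, lam=0.5):
    """Toy hybrid score: softmax entropy mixed with neighborhood disagreement."""
    entropy = -(probs * np.log(probs + 1e-12)).sum(1)
    deg = np.clip(adj.sum(1, keepdims=True), 1, None)
    neigh = (adj @ probs) / deg                    # average neighbor prediction
    disagreement = np.abs(probs - neigh).sum(1)    # local prediction conflict
    return lam * entropy + (1 - lam) * disagreement

def pick_queries(probs, adj, k=5):
    """Return the k nodes whose labels would be requested next."""
    return np.argsort(hybrid_uncertainty(probs, adj))[-k:]
```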
arXiv Detail & Related papers (2022-12-02T20:49:38Z)
- Learning with Few Labeled Nodes via Augmented Graph Self-Training [36.97506256446519]
An AGST (Augmented Graph Self-Training) framework is built with two new (i.e., structural and semantic) augmentation modules on top of a decoupled GST backbone.
We investigate whether this novel framework can learn an effective graph predictive model with extremely limited labeled nodes.
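Self-training of this kind can be pictured as a short loop: fit on the current label set, then promote confident predictions on unlabeled nodes to pseudo labels. This is a generic sketch; AGST's structural and semantic augmentation modules are omitted, and `train_fn`/`predict_fn` are hypothetical stand-ins for the decoupled backbone.

```python
import numpy as np

def self_training_round(train_fn, predict_fn, labels, mask, threshold=0.9):
    """One self-training round over a partially labeled node set.

    labels : (n,) int class labels (values on unlabeled rows are ignored)
    mask   : (n,) bool, True where a (gold or pseudo) label exists
    """
    model = train_fn(labels, mask)              # fit on the current label set
    probs = predict_fn(model)                   # (n, c) class probabilities
    conf, pred = probs.max(1), probs.argmax(1)
    new = (~mask) & (conf >= threshold)         # confident unlabeled nodes
    labels = labels.copy()
    labels[new] = pred[new]                     # promote to pseudo labels
    return labels, mask | new
```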
arXiv Detail & Related papers (2022-08-26T03:36:01Z)
- Bag of Tricks for Training Deeper Graph Neural Networks: A Comprehensive Benchmark Study [100.27567794045045]
Training deep graph neural networks (GNNs) is notoriously hard.
We present the first fair and reproducible benchmark dedicated to assessing the "tricks" of training deep GNNs.
arXiv Detail & Related papers (2021-08-24T05:00:37Z)
- ROD: Reception-aware Online Distillation for Sparse Graphs [23.55530524584572]
We propose ROD, a novel reception-aware online knowledge distillation approach for sparse graph learning.
We design three supervision signals for ROD: multi-scale reception-aware graph knowledge, task-based supervision, and rich distilled knowledge.
Our approach has been extensively evaluated on 9 datasets and a variety of graph-based tasks.
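The multi-scale, reception-aware part can be illustrated as follows: propagate node features to several neighborhood radii and keep each view as a separate supervision source. The hop set and the simple power-iteration form are assumptions for illustration; ROD's distillation losses are not shown.

```python
import numpy as np

def multi_scale_views(a_norm, x, hops=(1, 2, 4)):
    """Reception-aware views of the graph at increasing neighborhood sizes.

    a_norm : (n, n) normalized adjacency matrix
    x      : (n, d) node features
    Returns one feature matrix per receptive-field size in `hops`.
    """
    views, h, k = [], x, 0
    for target in sorted(hops):
        while k < target:        # propagate up to the next hop count
            h = a_norm @ h
            k += 1
        views.append(h)
    return views
```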
arXiv Detail & Related papers (2021-07-25T11:55:47Z)
- Weakly-supervised Graph Meta-learning for Few-shot Node Classification [53.36828125138149]
We propose a new graph meta-learning framework -- Graph Hallucination Networks (Meta-GHN).
Based on a new robustness-enhanced episodic training, Meta-GHN is meta-learned to hallucinate clean node representations from weakly-labeled data.
Extensive experiments demonstrate the superiority of Meta-GHN over existing graph meta-learning studies.
arXiv Detail & Related papers (2021-06-12T22:22:10Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN), which explore the implicit semantics by learning latent semantic-paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Cyclic Label Propagation for Graph Semi-supervised Learning [52.102251202186025]
We introduce a novel framework for graph semi-supervised learning called CycProp.
CycProp integrates GNNs into the process of label propagation in a cyclic and mutually reinforcing manner.
In particular, our proposed CycProp updates the node embeddings learned by GNN module with the augmented information by label propagation.
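The cycle the summary describes can be written as a short loop: the GNN step refines predictions, the label-propagation step uses them to augment the label set, and the two repeat. The function signatures here are hypothetical stand-ins for the two modules.

```python
def cycprop_style_loop(train_gnn, propagate, labels, mask, rounds=3):
    """Alternate GNN training and label propagation, each feeding the other.

    train_gnn : (labels, mask) -> (n, c) class probabilities
    propagate : (probs, labels, mask) -> augmented (labels, mask)
    """
    for _ in range(rounds):
        probs = train_gnn(labels, mask)                # GNN refines predictions
        labels, mask = propagate(probs, labels, mask)  # LP augments the labels
    return train_gnn(labels, mask)                     # final model's outputs
```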
arXiv Detail & Related papers (2020-11-24T02:55:40Z)
- Knowledge-Guided Multi-Label Few-Shot Learning for General Image Recognition [75.44233392355711]
The KGGR framework exploits prior knowledge of statistical label correlations with deep neural networks.
It first builds a structured knowledge graph to correlate different labels based on statistical label co-occurrence.
Then, it introduces the label semantics to guide learning semantic-specific features.
It exploits a graph propagation network to explore graph node interactions.
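The first step, building a label graph from statistical co-occurrence, admits a simple sketch: estimate P(label j | label i) from the training annotations and use it as the edge weight. This conditional-frequency form is one plausible reading; the summary does not give KGGR's exact formula.

```python
import numpy as np

def label_cooccurrence_graph(y, eps=1e-12):
    """Directed label graph with edge (i, j) = P(label j | label i).

    y : (num_images, num_labels) binary annotation matrix
    """
    co = (y.T @ y).astype(float)                  # raw co-occurrence counts
    counts = np.diag(co).copy()                   # per-label frequencies
    graph = co / np.maximum(counts[:, None], eps)
    np.fill_diagonal(graph, 0.0)                  # drop self-loops
    return graph
```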
arXiv Detail & Related papers (2020-09-20T15:05:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.