Towards Semi-supervised Universal Graph Classification
- URL: http://arxiv.org/abs/2305.19598v1
- Date: Wed, 31 May 2023 06:58:34 GMT
- Title: Towards Semi-supervised Universal Graph Classification
- Authors: Xiao Luo, Yusheng Zhao, Yifang Qin, Wei Ju, Ming Zhang
- Abstract summary: We study the problem of semi-supervised universal graph classification.
This problem is challenging due to a severe lack of labels and potential class shifts.
We propose a novel graph neural network framework named UGNN, which makes the best of unlabeled data from the subgraph perspective.
- Score: 6.339931887475018
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks have recently pushed the state of the art in
graph classification. Typically, these methods are studied within the context
of supervised end-to-end training, which necessitates copious task-specific
labels. However, in real-world circumstances, labeled data can be limited,
while a massive corpus of unlabeled data, possibly even from unknown classes,
may be available as a complement. Towards this end, we study the problem of
semi-supervised universal graph classification, which not only identifies
graph samples that do not belong to known classes, but also classifies the
remaining samples into their respective classes. This problem is challenging
due to a severe lack of labels and potential class shifts. In this paper, we
propose a novel graph neural network framework named UGNN, which makes the
best of unlabeled data from the subgraph perspective. To tackle class shifts,
we estimate the certainty of unlabeled graphs using multiple subgraphs, which
facilitates the
discovery of unlabeled data from unknown categories. Moreover, we construct
semantic prototypes in the embedding space for both known and unknown
categories and utilize posterior prototype assignments inferred from the
Sinkhorn-Knopp algorithm to learn from abundant unlabeled graphs across
different subgraph views. Extensive experiments on six datasets verify the
effectiveness of UGNN in different settings.
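To make the prototype-assignment step in the abstract concrete, the sketch below shows how posterior assignments of unlabeled graph embeddings to class prototypes can be computed with the Sinkhorn-Knopp algorithm. It is a minimal, generic version of Sinkhorn-based self-labeling, not UGNN's exact formulation: the function name sinkhorn_assignments, the temperature epsilon, and the three-iteration default are illustrative assumptions.

    import numpy as np

    def sinkhorn_assignments(scores, epsilon=0.05, n_iters=3):
        """Sinkhorn-Knopp soft assignments (illustrative hyperparameters).

        scores: (N, K) similarities between N unlabeled graph embeddings
                and K class prototypes (known + unknown categories).
        Returns an (N, K) matrix Q whose rows sum to 1 (posterior
        assignments) while columns are pushed toward a uniform marginal,
        which discourages collapse onto a single prototype.
        """
        Q = np.exp(scores / epsilon)           # temperature-scaled similarities
        Q /= Q.sum()                           # normalize to a joint distribution
        N, K = Q.shape
        for _ in range(n_iters):
            Q /= Q.sum(axis=0, keepdims=True)  # balance column (prototype) marginals
            Q /= K
            Q /= Q.sum(axis=1, keepdims=True)  # balance row (graph) marginals
            Q /= N
        return Q * N                           # each row now sums to 1

    # Toy usage: 4 unlabeled graphs, 3 prototypes (e.g., 2 known + 1 unknown class).
    rng = np.random.default_rng(0)
    similarities = rng.normal(size=(4, 3))
    posteriors = sinkhorn_assignments(similarities)
    print(posteriors.round(3))                 # rows are soft prototype assignments

In a semi-supervised pipeline, such balanced posteriors would typically serve as soft targets for the unlabeled graphs, so that predictions from different subgraph views can be trained to agree on a common assignment.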
Related papers
- Open-World Semi-Supervised Learning for Node Classification [53.07866559269709]
Open-world semi-supervised learning (Open-world SSL) for node classification is a practical but under-explored problem in the graph community.
We propose an IMbalance-Aware method named OpenIMA for Open-world semi-supervised node classification.
arXiv Detail & Related papers (2024-03-18T05:12:54Z)
- Mitigating Label Noise on Graph via Topological Sample Selection [72.86862597508077]
We propose a $\textit{Topological Sample Selection}$ (TSS) method that boosts the informative sample selection process in a graph by utilising topological information.
We theoretically prove that our procedure minimizes an upper bound of the expected risk under the target clean distribution, and experimentally show the superiority of our method compared with state-of-the-art baselines.
arXiv Detail & Related papers (2024-03-04T11:24:51Z)
- $\mathcal{G}^2Pxy$: Generative Open-Set Node Classification on Graphs with Proxy Unknowns [35.976426549671075]
We propose a novel generative open-set node classification method, i.e., $\mathcal{G}^2Pxy$.
It follows a stricter inductive learning setting where no information about unknown classes is available during training and validation.
$\mathcal{G}^2Pxy$ achieves superior effectiveness for unknown class detection and known class classification.
arXiv Detail & Related papers (2023-08-10T09:42:20Z)
- TGNN: A Joint Semi-supervised Framework for Graph-level Classification [34.300070497510276]
We propose a novel semi-supervised framework called Twin Graph Neural Network (TGNN).
To explore graph structural information from complementary views, our TGNN has a message passing module and a graph kernel module.
We evaluate our TGNN on various public datasets and show that it achieves strong performance.
arXiv Detail & Related papers (2023-04-23T15:42:11Z)
- Stochastic Subgraph Neighborhood Pooling for Subgraph Classification [2.1270496914042996]
Subgraph Neighborhood Pooling (SSNP) jointly aggregates the subgraph and its neighborhood information without any computationally expensive operations such as labeling tricks.
Our experiments demonstrate that our models outperform current state-of-the-art methods (with a margin of up to 2%) while being up to 3X faster in training.
arXiv Detail & Related papers (2023-04-17T18:49:18Z)
- Transductive Linear Probing: A Novel Framework for Few-Shot Node Classification [56.17097897754628]
We show that transductive linear probing with self-supervised graph contrastive pretraining can outperform the state-of-the-art fully supervised meta-learning based methods under the same protocol.
We hope this work can shed new light on few-shot node classification problems and foster future research on learning from scarcely labeled instances on graphs.
arXiv Detail & Related papers (2022-12-11T21:10:34Z)
- Geometer: Graph Few-Shot Class-Incremental Learning via Prototype Representation [50.772432242082914]
Existing graph neural network based methods mainly focus on classifying unlabeled nodes within fixed classes with abundant labeling.
In this paper, we focus on this challenging but practical graph few-shot class-incremental learning (GFSCIL) problem and propose a novel method called Geometer.
Instead of replacing and retraining the fully connected neural network classifier, Geometer predicts the label of a node by finding the nearest class prototype.
arXiv Detail & Related papers (2022-05-27T13:02:07Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown powerful capacity for modeling structural data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Handling Missing Data with Graph Representation Learning [62.59831675688714]
We propose GRAPE, a graph-based framework for feature imputation as well as label prediction.
Under GRAPE, the feature imputation is formulated as an edge-level prediction task and the label prediction as a node-level prediction task.
Experimental results on nine benchmark datasets show that GRAPE yields 20% lower mean absolute error for imputation tasks and 10% lower for label prediction tasks.
arXiv Detail & Related papers (2020-10-30T17:59:13Z)
- Adaptive-Step Graph Meta-Learner for Few-Shot Graph Classification [25.883839335786025]
We propose a novel framework consisting of a graph meta-learner, which uses GNN-based modules for fast adaptation on graph data.
Our framework gets state-of-the-art results on several few-shot graph classification tasks compared to baselines.
arXiv Detail & Related papers (2020-03-18T14:38:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.