GraphTTA: Test Time Adaptation on Graph Neural Networks
- URL: http://arxiv.org/abs/2208.09126v1
- Date: Fri, 19 Aug 2022 02:24:16 GMT
- Title: GraphTTA: Test Time Adaptation on Graph Neural Networks
- Authors: Guanzi Chen, Jiying Zhang, Xi Xiao and Yang Li
- Abstract summary: We present a novel test time adaptation strategy named Graph Adversarial Pseudo Group Contrast (GAPGC) for graph neural networks (GNNs).
GAPGC employs a contrastive learning variant as a self-supervised task during TTA, equipped with Adversarial Learnable Augmenter and Group Pseudo-Positive Samples.
We provide theoretical evidence that GAPGC can extract minimal sufficient information for the main task from an information-theoretic perspective.
- Score: 10.582212966736645
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, test time adaptation (TTA) has attracted increasing attention due
to its power to handle distribution shift in the real world. Unlike the methods
developed for convolutional neural networks (CNNs) on image data, TTA is less
explored for Graph Neural Networks (GNNs), and there is still a lack of
efficient algorithms tailored for graphs with irregular structures. In this
paper, we present a novel test time adaptation strategy named Graph Adversarial
Pseudo Group Contrast (GAPGC) for GNN TTA, to better adapt to Out-Of-Distribution
(OOD) test data. Specifically, GAPGC employs a contrastive learning variant as a
self-supervised task during TTA, equipped with an Adversarial Learnable Augmenter
and Group Pseudo-Positive Samples to enhance the relevance between the
self-supervised task and the main task, boosting the performance of the main
task. Furthermore, we provide theoretical evidence that GAPGC can extract
minimal sufficient information for the main task from an information-theoretic
perspective. Extensive experiments on molecular scaffold OOD datasets
demonstrate that the proposed approach achieves state-of-the-art performance
among GNNs.
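The adaptation procedure described in the abstract pairs pseudo-labels from the main-task head with a group-wise contrastive objective: test samples sharing a pseudo-label are treated as positives for one another. The following is an illustrative sketch of such a group contrastive loss, not the authors' implementation; the function name, the numpy setup, and the use of argmax pseudo-labels are all assumptions.

```python
import numpy as np

def group_pseudo_contrast_loss(z, pseudo_labels, tau=0.5):
    """Toy group contrastive loss: test samples sharing a pseudo-label
    form a positive group for each anchor (illustrative sketch only)."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalize embeddings
    sim = z @ z.T / tau                               # pairwise similarities
    n = len(z)
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs
    # row-wise log-softmax: every other sample is a candidate match
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    loss, count = 0.0, 0
    for i in range(n):
        pos = [j for j in range(n)
               if j != i and pseudo_labels[j] == pseudo_labels[i]]
        if pos:  # average log-probability over the whole positive group
            loss -= log_prob[i, pos].mean()
            count += 1
    return loss / max(count, 1)

# usage: embeddings of 4 test graphs, pseudo-labels from a frozen head
rng = np.random.default_rng(0)
z = rng.normal(size=(4, 8))
pseudo = np.array([0, 0, 1, 1])   # e.g. argmax of classifier logits
print(group_pseudo_contrast_loss(z, pseudo))
```

Averaging over a group of pseudo-positives, rather than relying on a single pseudo-labeled pair, is what plausibly makes the objective more tolerant of individual noisy pseudo-labels under distribution shift.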
Related papers
- Fair Graph Neural Network with Supervised Contrastive Regularization [12.666235467177131] (2024-04-09)
  We propose a novel model for training fairness-aware Graph Neural Networks (GNNs).
  Our approach integrates Supervised Contrastive Loss and Environmental Loss to enhance both accuracy and fairness.
- Tensor-view Topological Graph Neural Network [16.433092191206534] (2024-01-22)
  Graph neural networks (GNNs) have recently gained growing attention in graph learning.
  Existing GNNs use only local information from a very limited neighborhood around each node.
  We propose the Tensor-view Topological Graph Neural Network (TTG-NN), a class of simple yet effective deep learning models.
  Experiments on real data show that the proposed TTG-NN outperforms 20 state-of-the-art methods on various graph benchmarks.
- GOODAT: Towards Test-time Graph Out-of-Distribution Detection [103.40396427724667] (2024-01-10)
  Graph neural networks (GNNs) have found widespread application in modeling graph data across diverse domains.
  Recent studies have explored graph OOD detection, often focusing on training a specific model or modifying the data on top of a well-trained GNN.
  This paper introduces a data-centric, unsupervised, and plug-and-play solution that operates independently of training data and modifications of the GNN architecture.
- BHGNN-RT: Network embedding for directed heterogeneous graphs [8.7024326813104] (2023-11-24)
  We propose an embedding method, a bidirectional heterogeneous graph neural network with random teleport (BHGNN-RT), for directed heterogeneous graphs.
  Extensive experiments on various datasets verify the efficacy and efficiency of BHGNN-RT.
  BHGNN-RT achieves state-of-the-art performance, outperforming benchmark methods in both node classification and unsupervised clustering tasks.
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749] (2023-09-26)
  We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate learning bias via a novel and highly scalable approximation to the inverse mapping of GNNs.
  Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749] (2022-03-03)
  Graph neural networks (GNNs) have shown powerful capacity at modeling structural data.
  We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
  The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142] (2020-11-13)
  We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
  PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
  We show that PTDNet can significantly improve the performance of GNNs, and that the performance gain grows larger on noisier datasets.
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555] (2020-10-05)
  Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph-structured data.
  In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
  We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127] (2020-06-17)
  Graph representation learning has emerged as a powerful technique for addressing real-world problems.
  We design Graph Contrastive Coding (GCC), a self-supervised graph neural network pre-training framework.
  We conduct experiments on three graph learning tasks and ten graph datasets.
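Several of the papers above (GCC, and GAPGC itself) build on contrastive objectives of the InfoNCE family, which match two augmented views of the same graph against the views of all other graphs in a batch. A minimal numpy sketch of that loss follows; the names and setup are illustrative assumptions, not any paper's actual code.

```python
import numpy as np

def info_nce(view_a, view_b, tau=0.07):
    """Minimal InfoNCE sketch: row i of view_a should match row i of
    view_b; all other rows serve as negatives (illustrative only)."""
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    logits = a @ b.T / tau                        # similarity of every view pair
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # matched pairs on the diagonal

# usage: two slightly perturbed views of the same 8 graph embeddings
rng = np.random.default_rng(1)
h = rng.normal(size=(8, 16))
print(info_nce(h, h + 0.05 * rng.normal(size=(8, 16))))  # near-identical views
```

The loss is small when paired views stay close and grows when rows are mismatched, which is the signal these pre-training and adaptation methods exploit.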
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.