Robust Node Classification on Graphs: Jointly from Bayesian Label
Transition and Topology-based Label Propagation
- URL: http://arxiv.org/abs/2208.09779v1
- Date: Sun, 21 Aug 2022 01:56:25 GMT
- Title: Robust Node Classification on Graphs: Jointly from Bayesian Label
Transition and Topology-based Label Propagation
- Authors: Jun Zhuang, Mohammad Al Hasan
- Abstract summary: In recent years, evidence has emerged that the performance of GNN-based node classification can deteriorate substantially under topological perturbations.
We propose a new label inference model, namely LInDT, which integrates both Bayesian label transition and topology-based label propagation.
Experiments on five graph datasets demonstrate the superiority of LInDT for GNN-based node classification under three scenarios of topological perturbations.
- Score: 5.037076816350975
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Node classification using Graph Neural Networks (GNNs) has been widely
applied in various real-world scenarios. However, in recent years, compelling
evidence has emerged that the performance of GNN-based node classification can
deteriorate substantially under topological perturbations, such as random
connections or adversarial attacks. Various solutions, such as topological
denoising methods and mechanism design methods, have been proposed to develop
robust GNN-based node classifiers but none of these works can fully address the
problems related to topological perturbations. Recently, the Bayesian label
transition model was proposed to tackle this issue, but its slow convergence can
lead to inferior performance. In this work, we propose a new label inference
model, namely LInDT, which integrates both Bayesian label transition and
topology-based label propagation for improving the robustness of GNNs against
topological perturbations. LInDT is superior to existing label transition
methods as it improves the label prediction of uncertain nodes by utilizing
neighborhood-based label propagation, leading to better convergence of label
inference. In addition, LInDT adopts an asymmetric Dirichlet distribution as its prior,
which also helps it to improve label inference. Extensive experiments on five
graph datasets demonstrate the superiority of LInDT for GNN-based node
classification under three scenarios of topological perturbations.
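The neighborhood-based refinement described in the abstract can be illustrated with a minimal sketch (this is a simplification for intuition, not the paper's inference procedure; the function name and toy graph are hypothetical): each uncertain node adopts the majority label among its neighbors.

```python
import numpy as np

def propagate_labels(adj, labels, uncertain, n_classes):
    """One round of neighborhood-based label propagation:
    each uncertain node adopts the most common label among
    its neighbors (a hypothetical simplification of LInDT's
    topology-based refinement)."""
    new_labels = labels.copy()
    for v in uncertain:
        neigh = adj[v]
        if not neigh:
            continue  # isolated node: keep its current label
        counts = np.bincount([labels[u] for u in neigh], minlength=n_classes)
        new_labels[v] = int(np.argmax(counts))
    return new_labels

# toy graph: node 2 is uncertain; its neighbors 0 and 1 both carry class 1
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
labels = np.array([1, 1, 0])
print(propagate_labels(adj, labels, uncertain=[2], n_classes=2))  # -> [1 1 1]
```

In the actual model this step is interleaved with Bayesian label transition sampling under the asymmetric Dirichlet prior; the sketch only shows why neighborhood information speeds convergence for uncertain nodes.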
Related papers
- Graph Neural Networks with Coarse- and Fine-Grained Division for Mitigating Label Sparsity and Noise [5.943641527857957]
Graph Neural Networks (GNNs) have gained prominence in semi-supervised learning tasks in processing graph-structured data.
In real-world scenarios, node labels are inevitably noisy and sparse, which significantly degrades the performance of GNNs.
We propose a novel GNN-CFGD that reduces the negative impact of noisy labels via coarse- and fine-grained division, along with graph reconstruction.
arXiv Detail & Related papers (2024-11-06T08:21:26Z)
- Enhancing the Resilience of Graph Neural Networks to Topological Perturbations in Sparse Graphs [9.437128738619563]
We propose a novel label inference framework, TraTopo, which combines topology-driven label propagation, Bayesian label transitions, and link analysis via random walks.
TraTopo significantly surpasses its predecessors on sparse graphs by utilizing random walk sampling, specifically targeting isolated nodes for link prediction.
Empirical evaluations highlight TraTopo's superiority in node classification, significantly exceeding contemporary GCN models in accuracy.
arXiv Detail & Related papers (2024-06-05T09:40:08Z)
- Domain-adaptive Message Passing Graph Neural Network [67.35534058138387]
Cross-network node classification (CNNC) aims to classify nodes in a label-deficient target network by transferring the knowledge from a source network with abundant labels.
We propose a domain-adaptive message passing graph neural network (DM-GNN), which integrates graph neural network (GNN) with conditional adversarial domain adaptation.
arXiv Detail & Related papers (2023-08-31T05:26:08Z)
- Pseudo Contrastive Learning for Graph-based Semi-supervised Learning [67.37572762925836]
Pseudo Labeling is a technique used to improve the performance of Graph Neural Networks (GNNs).
We propose a general framework for GNNs, termed Pseudo Contrastive Learning (PCL).
arXiv Detail & Related papers (2023-02-19T10:34:08Z)
- NRGNN: Learning a Label Noise-Resistant Graph Neural Network on Sparsely and Noisily Labeled Graphs [20.470934944907608]
Graph Neural Networks (GNNs) have achieved promising results for semi-supervised learning tasks on graphs such as node classification.
Many real-world graphs are often sparsely and noisily labeled, which could significantly degrade the performance of GNNs.
We propose to develop a label noise-resistant GNN for semi-supervised node classification.
arXiv Detail & Related papers (2021-06-08T22:12:44Z)
- Label-GCN: An Effective Method for Adding Label Propagation to Graph Convolutional Networks [1.8352113484137624]
We show that a modification of the first layer of a Graph Convolutional Network (GCN) can be used to effectively propagate label information across neighbor nodes.
This is done by selectively eliminating self-loops for the label features during the training phase of a GCN.
We show through several experiments that, depending on how many labels are available during the inference phase, this strategy can lead to a substantial improvement in the model performance.
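The self-loop elimination described above can be sketched as follows (a hypothetical simplification, not the paper's exact layer; the function name and toy graph are illustrative): one-hot training labels are propagated over the graph, with the adjacency diagonal zeroed during training so no node sees its own label.

```python
import numpy as np

def label_propagation_input(adj, labels, train_mask, n_classes, training=True):
    """Label-GCN-style propagation of label features (hypothetical sketch):
    one-hot training labels are averaged over each node's neighborhood.
    During training the adjacency diagonal is zeroed ('self-loop
    elimination'), so a node cannot leak its own label to itself."""
    Y = np.zeros((len(labels), n_classes))
    idx = np.where(train_mask)[0]
    Y[idx, labels[idx]] = 1.0          # one-hot labels for training nodes only
    A = adj.astype(float).copy()
    if training:
        np.fill_diagonal(A, 0.0)       # drop self-loops for label features
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                # avoid division by zero for isolated nodes
    return (A / deg) @ Y               # each row: average of neighbors' labels

# triangle graph with self-loops; node 0 has label 0, nodes 1 and 2 label 1
adj = np.ones((3, 3))
labels = np.array([0, 1, 1])
mask = np.array([True, True, True])
print(label_propagation_input(adj, labels, mask, n_classes=2))
```

At inference time, `training=False` keeps the self-loops, which is how additional observed labels can directly boost a node's own prediction.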
arXiv Detail & Related papers (2021-04-05T21:02:48Z)
- Delving Deep into Label Smoothing [112.24527926373084]
Label smoothing is an effective regularization tool for deep neural networks (DNNs).
We present an Online Label Smoothing (OLS) strategy, which generates soft labels based on the statistics of the model prediction for the target category.
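The OLS idea can be sketched in a few lines (a hypothetical simplification of the paper's strategy; the class name is illustrative): accumulate the model's softmax outputs on correctly classified samples per target class, and serve the normalized running average as the next epoch's soft targets.

```python
import numpy as np

class OnlineLabelSmoother:
    """Sketch of Online Label Smoothing (OLS), simplified:
    per-class soft labels are built from the statistics of the
    model's own predictions on correctly classified samples."""
    def __init__(self, n_classes):
        # start from hard (one-hot) labels so early epochs are stable
        self.sums = np.eye(n_classes)

    def update(self, probs, targets):
        """Accumulate softmax outputs, but only from correct predictions."""
        for p, t in zip(probs, targets):
            if p.argmax() == t:
                self.sums[t] += p

    def soft_label(self, t):
        """Normalized per-class average prediction = soft target for class t."""
        return self.sums[t] / self.sums[t].sum()

# usage: after each epoch, feed the model's softmax outputs and targets
ols = OnlineLabelSmoother(n_classes=3)
ols.update(np.array([[0.7, 0.2, 0.1]]), np.array([0]))
print(ols.soft_label(0))  # peaked at class 0 but smoothed toward classes 1 and 2
```

Unlike uniform label smoothing, the off-target mass here reflects inter-class similarity as seen by the model itself.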
arXiv Detail & Related papers (2020-11-25T08:03:11Z)
- Cyclic Label Propagation for Graph Semi-supervised Learning [52.102251202186025]
We introduce a novel framework for graph semi-supervised learning called CycProp.
CycProp integrates GNNs into the process of label propagation in a cyclic and mutually reinforcing manner.
In particular, our proposed CycProp updates the node embeddings learned by GNN module with the augmented information by label propagation.
arXiv Detail & Related papers (2020-11-24T02:55:40Z)
- On the Equivalence of Decoupled Graph Convolution Network and Label Propagation [60.34028546202372]
Some work shows that coupling is inferior to decoupling, as decoupling better supports deep graph propagation.
Despite effectiveness, the working mechanisms of the decoupled GCN are not well understood.
We propose a new label propagation method named propagation then training Adaptively (PTA), which overcomes the flaws of the decoupled GCN.
arXiv Detail & Related papers (2020-10-23T13:57:39Z)
- Unifying Graph Convolutional Neural Networks and Label Propagation [73.82013612939507]
We study the relationship between LPA and GCN in terms of two aspects: feature/label smoothing and feature/label influence.
Based on our theoretical analysis, we propose an end-to-end model that unifies GCN and LPA for node classification.
Our model can also be seen as learning attention weights based on node labels, which is more task-oriented than existing feature-based attention models.
arXiv Detail & Related papers (2020-02-17T03:23:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.