NRGNN: Learning a Label Noise-Resistant Graph Neural Network on Sparsely
and Noisily Labeled Graphs
- URL: http://arxiv.org/abs/2106.04714v1
- Date: Tue, 8 Jun 2021 22:12:44 GMT
- Title: NRGNN: Learning a Label Noise-Resistant Graph Neural Network on Sparsely
and Noisily Labeled Graphs
- Authors: Enyan Dai, Charu Aggarwal, Suhang Wang
- Abstract summary: Graph Neural Networks (GNNs) have achieved promising results for semi-supervised learning tasks on graphs such as node classification.
Many real-world graphs are often sparsely and noisily labeled, which could significantly degrade the performance of GNNs.
We propose to develop a label noise-resistant GNN for semi-supervised node classification.
- Score: 20.470934944907608
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have achieved promising results for
semi-supervised learning tasks on graphs such as node classification. Despite
the great success of GNNs, many real-world graphs are often sparsely and
noisily labeled, which could significantly degrade the performance of GNNs, as
the noisy information could propagate to unlabeled nodes via graph structure.
Thus, it is important to develop a label noise-resistant GNN for
semi-supervised node classification. Though extensive studies have been
conducted to learn neural networks with noisy labels, they mostly focus on
independent and identically distributed data and assume a large number of noisy
labels are available, which are not directly applicable to GNNs. Thus, we
investigate a novel problem of learning a robust GNN with noisy and limited
labels. To alleviate the negative effects of label noise, we propose to link
the unlabeled nodes with labeled nodes of high feature similarity to bring more
clean label information. Furthermore, accurate pseudo labels could be obtained
by this strategy to provide more supervision and further reduce the effects of
label noise. Our theoretical and empirical analyses verify the effectiveness of
these two strategies under mild conditions. Extensive experiments on real-world
datasets demonstrate the effectiveness of the proposed method in learning a
robust GNN with noisy and limited labels.
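As a rough illustration of the two strategies above (not the authors' implementation), the sketch below links each unlabeled node to its most feature-similar labeled nodes and assigns pseudo labels by majority vote. The function name, the raw cosine-similarity measure, and the top-k rule are illustrative assumptions; NRGNN itself learns an edge predictor and weighs pseudo labels by confidence.

```python
# Minimal sketch (assumed heuristics, not NRGNN's learned edge predictor):
# connect each unlabeled node to its k most feature-similar labeled nodes,
# then pseudo-label it by majority vote among those neighbors.
import numpy as np

def link_and_pseudo_label(X, labeled_idx, y_labeled, k=3):
    """X: (n, d) node features; labeled_idx: labeled node indices;
    y_labeled: their (possibly noisy) labels, aligned with labeled_idx."""
    # Cosine similarity between every node and every labeled node.
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    sim = Xn @ Xn[labeled_idx].T                          # shape (n, n_labeled)

    new_edges, pseudo = [], {}
    unlabeled = np.setdiff1d(np.arange(X.shape[0]), labeled_idx)
    for u in unlabeled:
        top = np.argsort(-sim[u])[:k]                     # k most similar labeled nodes
        new_edges += [(u, labeled_idx[t]) for t in top]   # candidate edges to add
        pseudo[u] = np.bincount(y_labeled[top]).argmax()  # majority-vote pseudo label
    return new_edges, pseudo

# toy usage
X = np.random.randn(6, 4)
edges, pseudo = link_and_pseudo_label(X, np.array([0, 1, 2]), np.array([0, 1, 0]), k=2)
```

In the paper, edge weights and pseudo-label confidence are learned jointly; this sketch replaces both with fixed heuristics.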
Related papers
- Graph Neural Networks with Coarse- and Fine-Grained Division for Mitigating Label Sparsity and Noise [5.943641527857957]
Graph Neural Networks (GNNs) have gained prominence in semi-supervised learning tasks on graph-structured data.
In real-world scenarios, node labels are inevitably noisy and sparse, which significantly degrades the performance of GNNs.
We propose GNN-CFGD, a novel method that reduces the negative impact of noisy labels via coarse- and fine-grained division, along with graph reconstruction.
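The summary does not spell out the division criterion, so the sketch below uses the common small-loss heuristic as a stand-in for the coarse clean/noisy split; the quantile threshold is an illustrative assumption, not the paper's rule.

```python
# Hedged sketch of a coarse clean/noisy division via the small-loss heuristic:
# labeled nodes with low training loss are more likely to carry clean labels.
import numpy as np

def coarse_division(probs, y, quantile=0.5):
    """probs: (n, c) predicted class probabilities for labeled nodes;
    y: (n,) observed labels. Returns boolean masks (clean, noisy)."""
    per_node_loss = -np.log(probs[np.arange(len(y)), y] + 1e-12)
    threshold = np.quantile(per_node_loss, quantile)   # illustrative cut point
    clean = per_node_loss <= threshold
    return clean, ~clean
```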
arXiv Detail & Related papers (2024-11-06T08:21:26Z)
- ERASE: Error-Resilient Representation Learning on Graphs for Label Noise Tolerance [53.73316938815873]
We propose a method called ERASE (Error-Resilient representation learning on graphs for lAbel noiSe tolerancE) to learn representations with error tolerance.
ERASE combines prototype pseudo-labels with propagated denoised labels and updates representations with error resilience.
Our method outperforms multiple baselines by clear margins across a broad range of noise levels and scales well.
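A minimal sketch of combining the two label estimates the summary names: a prototype pseudo-label (nearest class centroid in embedding space) and a propagated label (one hop of label propagation). The agreement rule is this sketch's assumption, not ERASE's exact procedure, and it assumes every class has at least one labeled node.

```python
# Hedged sketch: accept a denoised label only where the prototype-based and
# propagation-based estimates agree; -1 marks "undecided".
import numpy as np

def combined_labels(Z, A, labeled_idx, y_labeled, n_classes):
    """Z: (n, d) embeddings; A: (n, n) adjacency matrix."""
    # Prototype pseudo-labels: nearest class centroid of labeled embeddings.
    protos = np.stack([Z[labeled_idx][y_labeled == c].mean(0)
                       for c in range(n_classes)])
    proto_lab = np.argmin(((Z[:, None] - protos[None]) ** 2).sum(-1), axis=1)

    # Propagated labels: one step of label propagation from labeled nodes.
    Y = np.zeros((Z.shape[0], n_classes))
    Y[labeled_idx, y_labeled] = 1.0
    prop_lab = (A @ Y).argmax(1)

    return np.where(proto_lab == prop_lab, proto_lab, -1)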
arXiv Detail & Related papers (2023-12-13T17:59:07Z)
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, which aims to assess the performance of a specific GNN model, trained on labeled and observed graphs, on unseen graphs without labels.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z)
- Learning on Graphs under Label Noise [5.909452203428086]
We develop a novel approach dubbed Consistent Graph Neural Network (CGNN) to solve the problem of learning on graphs with label noise.
Specifically, we employ graph contrastive learning as a regularization term, which promotes two views of augmented nodes to have consistent representations.
To detect noisy labels on the graph, we present a sample selection technique based on the homophily assumption.
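A minimal sketch of homophily-based sample selection as the summary describes it: under homophily, a labeled node whose label disagrees with its neighborhood is suspect. The majority-vote test over model predictions is an assumption of this sketch, not CGNN's exact criterion.

```python
# Hedged sketch: flag a labeled node as possibly noisy when its given label
# disagrees with the majority prediction among its graph neighbors.
import numpy as np

def flag_noisy(A, preds, labeled_idx, y_labeled):
    """A: (n, n) adjacency; preds: (n,) current model predictions."""
    flags = []
    for i, v in enumerate(labeled_idx):
        nbrs = np.nonzero(A[v])[0]
        if len(nbrs) == 0:
            flags.append(False)          # isolated node: no evidence either way
            continue
        majority = np.bincount(preds[nbrs]).argmax()
        flags.append(majority != y_labeled[i])
    return np.array(flags)               # True = suspected noisy label
```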
arXiv Detail & Related papers (2023-06-14T01:38:01Z)
- Pseudo Contrastive Learning for Graph-based Semi-supervised Learning [67.37572762925836]
Pseudo-labeling is a technique used to improve the performance of Graph Neural Networks (GNNs).
We propose a general framework for GNNs, termed Pseudo Contrastive Learning (PCL)
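A hedged sketch of what a pseudo-label-driven contrastive objective could look like: nodes sharing a pseudo-label are treated as positives in a supervised-contrastive-style loss. PCL's actual objective may differ from this form.

```python
# Hedged sketch: pull together embeddings of nodes with the same pseudo-label,
# push apart the rest, in the style of supervised contrastive learning.
import numpy as np

def pseudo_contrastive_loss(Z, pseudo, tau=0.5):
    """Z: (n, d) L2-normalised embeddings; pseudo: (n,) pseudo-labels."""
    sim = np.exp(Z @ Z.T / tau)
    np.fill_diagonal(sim, 0.0)                       # exclude self-pairs
    same = (pseudo[:, None] == pseudo[None, :]) & ~np.eye(len(pseudo), dtype=bool)
    pos = (sim * same).sum(1)                        # similarity to positives
    total = sim.sum(1)                               # similarity to everything
    return -np.log(pos / total + 1e-12).mean()
```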
arXiv Detail & Related papers (2023-02-19T10:34:08Z)
- Robust Training of Graph Neural Networks via Noise Governance [27.767913371777247]
Graph Neural Networks (GNNs) have become widely used models for semi-supervised learning.
In this paper, we consider an important yet challenging scenario where labels on nodes of graphs are not only noisy but also scarce.
We propose a novel RTGNN framework that achieves better robustness by learning to explicitly govern label noise.
arXiv Detail & Related papers (2022-11-12T09:25:32Z)
- Informative Pseudo-Labeling for Graph Neural Networks with Few Labels [12.83841767562179]
Graph Neural Networks (GNNs) have achieved state-of-the-art results for semi-supervised node classification on graphs.
How to effectively learn GNNs with very few labels remains under-explored.
We propose a novel informative pseudo-labeling framework, called InfoGNN, to facilitate learning of GNNs with extremely few labels.
arXiv Detail & Related papers (2022-01-20T01:49:30Z)
- Towards Robust Graph Neural Networks for Noisy Graphs with Sparse Labels [24.25945793671978]
We study a novel problem of developing robust GNNs on noisy graphs with limited labeled nodes.
Our analysis shows that both the noisy edges and limited labeled nodes could harm the message-passing mechanism of GNNs.
We propose a novel framework which adopts the noisy edges as supervision to learn a denoised and dense graph.
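A minimal sketch of treating (noisy) observed edges as supervision: a link scorer trained to reconstruct the adjacency can downweight implausible observed edges and add confident missing ones. The sigmoid dot-product scorer and both thresholds are illustrative assumptions, not the paper's design.

```python
# Hedged sketch: score all node pairs with a simple link predictor, drop
# low-scoring observed edges, and add high-scoring non-edges.
import numpy as np

def denoise_graph(Z, A, keep_thresh=0.5, add_thresh=0.9):
    """Z: (n, d) embeddings from a model trained to reconstruct A."""
    S = 1.0 / (1.0 + np.exp(-(Z @ Z.T)))     # sigmoid link scores
    kept = (A > 0) & (S >= keep_thresh)      # drop low-scoring observed edges
    added = (A == 0) & (S >= add_thresh)     # add confident new edges
    np.fill_diagonal(added, False)
    return (kept | added).astype(float)      # denoised, densified adjacency
```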
arXiv Detail & Related papers (2022-01-01T19:00:26Z)
- Noise-robust Graph Learning by Estimating and Leveraging Pairwise Interactions [123.07967420310796]
This paper bridges the gap by proposing a pairwise framework for noisy node classification on graphs.
PI-GNN relies on pairwise interactions (PI) as a primary learning proxy, in addition to pointwise learning from the noisy node class labels.
Our proposed framework PI-GNN contributes two novel components: (1) a confidence-aware PI estimation model that adaptively estimates the PI labels, and (2) a decoupled training approach that leverages the estimated PI labels.
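A hedged sketch of pairwise-interaction (PI) targets: for a pair of labeled nodes, the PI label is a same-class indicator, and the product of per-node confidences stands in for the paper's confidence-aware estimator.

```python
# Hedged sketch: derive PI targets from (noisy) labels and weight each pair
# by how confident the current model is about both endpoints.
import numpy as np

def pi_targets(y, probs, pairs):
    """y: (n,) noisy labels; probs: (n, c) model probabilities;
    pairs: list of (i, j) node index pairs."""
    targets, weights = [], []
    for i, j in pairs:
        targets.append(float(y[i] == y[j]))           # same-class indicator
        conf_i = probs[i, y[i]]                       # confidence in each label
        conf_j = probs[j, y[j]]
        weights.append(conf_i * conf_j)               # downweight doubtful pairs
    return np.array(targets), np.array(weights)
```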
arXiv Detail & Related papers (2021-06-14T14:23:08Z)
- Cyclic Label Propagation for Graph Semi-supervised Learning [52.102251202186025]
We introduce a novel framework for graph semi-supervised learning called CycProp.
CycProp integrates GNNs into the process of label propagation in a cyclic and mutually reinforcing manner.
In particular, our proposed CycProp updates the node embeddings learned by the GNN module with the augmented information from label propagation.
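A rough sketch of this cyclic scheme, alternating label propagation with a refit of a representation model; the linear least-squares "encoder" stands in for a GNN purely to keep the sketch self-contained, and the mixing weights are illustrative assumptions.

```python
# Hedged sketch: propagate labels over the graph, refit a stand-in encoder on
# the propagated soft labels, and mix the two in each round.
import numpy as np

def cycprop_sketch(X, A, labeled_idx, y_labeled, n_classes, rounds=3):
    n = X.shape[0]
    Y = np.zeros((n, n_classes))
    Y[labeled_idx, y_labeled] = 1.0
    D = np.maximum(A.sum(axis=1, keepdims=True), 1.0)   # row-normaliser
    for _ in range(rounds):
        # (1) propagate current soft labels over the graph
        Y = (A / D) @ Y
        Y[labeled_idx] = 0.0
        Y[labeled_idx, y_labeled] = 1.0                 # re-anchor given labels
        # (2) refit the stand-in encoder on the propagated soft labels
        W, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)
        # (3) mutual reinforcement: mix propagation with encoder predictions
        Y = 0.5 * Y + 0.5 * (X @ W)
    return Y.argmax(axis=1)
```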
arXiv Detail & Related papers (2020-11-24T02:55:40Z)
- Label-Consistency based Graph Neural Networks for Semi-supervised Node Classification [47.753422069515366]
Graph neural networks (GNNs) achieve remarkable success in graph-based semi-supervised node classification.
In this paper, we propose the label-consistency based graph neural network (LC-GNN), leveraging unconnected node pairs with the same labels to enlarge the receptive field of nodes in GNNs.
Experiments on benchmark datasets demonstrate that the proposed LC-GNN outperforms traditional GNNs in graph-based semi-supervised node classification.
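A minimal sketch of the label-consistency idea: connect unconnected labeled pairs that share a label before running a standard GNN. LC-GNN's actual construction may be more selective than this exhaustive pairing.

```python
# Hedged sketch: add a symmetric edge between every unconnected pair of
# labeled nodes that share the same label.
import numpy as np

def add_label_consistent_edges(A, labeled_idx, y_labeled):
    A_aug = A.copy()
    for a in range(len(labeled_idx)):
        for b in range(a + 1, len(labeled_idx)):
            if y_labeled[a] == y_labeled[b]:
                u, v = labeled_idx[a], labeled_idx[b]
                if A_aug[u, v] == 0:                  # only unconnected pairs
                    A_aug[u, v] = A_aug[v, u] = 1.0   # add symmetric edge
    return A_aug
```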
arXiv Detail & Related papers (2020-07-27T11:17:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.