Cyclic Label Propagation for Graph Semi-supervised Learning
- URL: http://arxiv.org/abs/2011.11860v1
- Date: Tue, 24 Nov 2020 02:55:40 GMT
- Title: Cyclic Label Propagation for Graph Semi-supervised Learning
- Authors: Zhao Li, Yixin Liu, Zhen Zhang, Shirui Pan, Jianliang Gao, Jiajun Bu
- Abstract summary: We introduce a novel framework for graph semi-supervised learning called CycProp.
CycProp integrates GNNs into the process of label propagation in a cyclic and mutually reinforcing manner.
In particular, our proposed CycProp updates the node embeddings learned by the GNN module with information augmented by label propagation.
- Score: 52.102251202186025
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have emerged as effective approaches for graph
analysis, especially in the scenario of semi-supervised learning. Despite their
success, GNNs often suffer from over-smoothing and over-fitting, which degrade
their performance on node classification tasks. We observe that an alternative
method, the label propagation algorithm (LPA), avoids these problems and is
therefore a promising choice for graph semi-supervised learning. Nevertheless,
the intrinsic limitations of LPA in feature exploitation and relation modeling
make label propagation less effective. To overcome these limitations, we
introduce a novel framework for graph semi-supervised learning termed Cyclic
Label Propagation (CycProp), which integrates GNNs into the label propagation
process in a cyclic and mutually reinforcing manner to exploit the advantages
of both GNNs and LPA. In particular, CycProp updates the node embeddings
learned by the GNN module with information augmented by label propagation, and
in turn fine-tunes the weighted graph used for label propagation with the help
of the node embeddings. After the model converges, reliably predicted labels
and informative node embeddings are obtained from the LPA and GNN modules,
respectively. Extensive experiments on various real-world datasets empirically
demonstrate that CycProp achieves significant gains over state-of-the-art
methods.
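To make the cyclic interplay concrete, below is a minimal numpy sketch of the data flow the abstract describes. It is not the authors' implementation: `label_propagation` is the classic clamped-propagation scheme, `embed` stands in for the trained GNN module with a simple feature-smoothing placeholder, and `reweight` is an assumed cosine-similarity rule for fine-tuning the edge weights; all function names and parameters are illustrative assumptions.

```python
import numpy as np

def label_propagation(W, Y_init, labeled_mask, n_iters=50):
    # Classic label propagation: repeatedly average neighbours' label
    # distributions over the row-normalised weighted adjacency,
    # clamping the labeled nodes after every step.
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
    Y = Y_init.copy()
    for _ in range(n_iters):
        Y = P @ Y
        Y[labeled_mask] = Y_init[labeled_mask]
    return Y

def embed(X, W, n_layers=2):
    # Stand-in for the GNN module: plain neighbourhood smoothing of the
    # (label-augmented) features. CycProp trains a real GNN here; this
    # placeholder only illustrates the data flow of the cycle.
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
    H = X
    for _ in range(n_layers):
        H = P @ H
    return H

def reweight(A, H):
    # Assumed rule for fine-tuning LPA's weighted graph: cosine
    # similarity of node embeddings, restricted to the original edges.
    Hn = H / np.maximum(np.linalg.norm(H, axis=1, keepdims=True), 1e-12)
    S = np.clip(Hn @ Hn.T, 0.0, None)   # drop negative similarities
    return A * S                         # keep the sparsity pattern of A

def cycprop(A, X, Y_init, labeled_mask, n_cycles=5):
    # One cycle: LPA step -> GNN step -> edge re-weighting step.
    W = A.astype(float)
    for _ in range(n_cycles):
        Y = label_propagation(W, Y_init, labeled_mask)  # propagate labels
        H = embed(np.hstack([X, Y]), W)                 # embeddings augmented by propagated labels
        W = reweight(A, H)                              # refresh LPA's weighted graph
    return Y, H
```

Here `A` is a dense 0/1 adjacency matrix, `X` the node-feature matrix, and `Y_init` a one-hot label matrix with zero rows for unlabeled nodes; the converged `Y` plays the role of the reliably predicted labels and `H` of the informative embeddings mentioned above.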
Related papers
- DFA-GNN: Forward Learning of Graph Neural Networks by Direct Feedback Alignment [57.62885438406724]
Graph neural networks are recognized for their strong performance across various applications.
Backpropagation (BP) has limitations that challenge its biological plausibility and affect the efficiency, scalability, and parallelism of training neural networks for graph-based tasks.
We propose DFA-GNN, a novel forward learning framework tailored for GNNs with a case study of semi-supervised learning.
arXiv Detail & Related papers (2024-06-04T07:24:51Z)
- A GAN Approach for Node Embedding in Heterogeneous Graphs Using Subgraph Sampling [35.94125831564648]
Our research addresses class imbalance issues in heterogeneous graphs using graph neural networks (GNNs).
We propose a novel method combining the strengths of Generative Adversarial Networks (GANs) with GNNs, creating synthetic nodes and edges that effectively balance the dataset.
arXiv Detail & Related papers (2023-12-11T16:52:20Z)
- Breaking the Entanglement of Homophily and Heterophily in Semi-supervised Node Classification [25.831508778029097]
We introduce AMUD, which quantifies the relationship between node profiles and topology from a statistical perspective.
We also propose ADPA as a new directed graph learning paradigm for AMUD.
arXiv Detail & Related papers (2023-12-07T07:54:11Z)
- Nonlinear Correct and Smooth for Semi-Supervised Learning [1.622641093702668]
Graph-based semi-supervised learning (GSSL) has been used successfully in various applications.
We propose Nonlinear Correct and Smooth (NLCS), which improves the existing post-processing approach by incorporating non-linearity and higher-order representations.
arXiv Detail & Related papers (2023-10-09T14:33:32Z)
- Label Deconvolution for Node Representation Learning on Large-scale Attributed Graphs against Learning Bias [75.44877675117749]
We propose an efficient label regularization technique, namely Label Deconvolution (LD), to alleviate the learning bias by a novel and highly scalable approximation to the inverse mapping of GNNs.
Experiments demonstrate that LD significantly outperforms state-of-the-art methods on Open Graph Benchmark datasets.
arXiv Detail & Related papers (2023-09-26T13:09:43Z)
- Bregman Graph Neural Network [27.64062763929748]
In node classification tasks, the smoothing effect induced by GNNs tends to assimilate representations and over-homogenize labels of connected nodes.
We propose a novel bilevel optimization framework for GNNs inspired by the notion of Bregman distance.
arXiv Detail & Related papers (2023-09-12T23:54:24Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Noise-robust Graph Learning by Estimating and Leveraging Pairwise Interactions [123.07967420310796]
This paper bridges the gap by proposing a pairwise framework for noisy node classification on graphs.
PI-GNN relies on pairwise interactions (PI) as a primary learning proxy in addition to pointwise learning from the noisy node class labels.
Our proposed framework PI-GNN contributes two novel components: (1) a confidence-aware PI estimation model that adaptively estimates the PI labels, and (2) a decoupled training approach that leverages the estimated PI labels.
arXiv Detail & Related papers (2021-06-14T14:23:08Z)
- When Contrastive Learning Meets Active Learning: A Novel Graph Active Learning Paradigm with Self-Supervision [19.938379604834743]
This paper studies active learning (AL) on graphs, whose purpose is to discover the most informative nodes in order to maximize the performance of graph neural networks (GNNs).
Motivated by the success of contrastive learning (CL), we propose a novel paradigm that seamlessly integrates graph AL with CL.
Comprehensive, confounding-free experiments on five public datasets demonstrate the superiority of our method over state-of-the-art approaches.
arXiv Detail & Related papers (2020-10-30T06:20:07Z)
- Unifying Graph Convolutional Neural Networks and Label Propagation [73.82013612939507]
We study the relationship between LPA and GCN in terms of two aspects: feature/label smoothing and feature/label influence.
Based on our theoretical analysis, we propose an end-to-end model that unifies GCN and LPA for node classification.
Our model can also be seen as learning attention weights based on node labels, which is more task-oriented than existing feature-based attention models.
arXiv Detail & Related papers (2020-02-17T03:23:13Z)
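The unifying idea in the last entry can be made concrete as a joint objective: edge weights are trained so that both the GCN's predictions and the labels LPA propagates over the same weighted graph fit the training labels. Below is a minimal numpy sketch of one such combined loss; the function name, the `lam` trade-off parameter, and the exact form are assumptions for illustration, not that paper's published objective.

```python
import numpy as np

def unified_loss(gcn_logits, lpa_probs, y_true, labeled_mask, lam=1.0):
    # Cross-entropy of the GCN predictions plus a weighted LPA term,
    # both evaluated on labeled nodes only, so that learned edge
    # weights must serve feature smoothing and label smoothing alike.
    def xent(probs, y, mask):
        p = np.clip(probs[mask, y[mask]], 1e-12, 1.0)
        return -np.mean(np.log(p))
    z = gcn_logits - gcn_logits.max(axis=1, keepdims=True)  # stable softmax
    gcn_probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    return xent(gcn_probs, y_true, labeled_mask) + lam * xent(lpa_probs, y_true, labeled_mask)
```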