Improving Signed Propagation for Graph Neural Networks in Multi-Class Environments
- URL: http://arxiv.org/abs/2301.08918v7
- Date: Mon, 30 Sep 2024 13:58:53 GMT
- Title: Improving Signed Propagation for Graph Neural Networks in Multi-Class Environments
- Authors: Yoonhyuk Choi, Jiho Choi, Taewook Ko, Chong-Kwon Kim
- Abstract summary: We introduce two novel strategies for improving signed propagation under multi-class graphs.
The proposed scheme combines calibration to secure robustness while reducing uncertainty.
We show the efficacy of our theorem through extensive experiments on six benchmark graph datasets.
- Score: 3.4498722449655066
- License:
- Abstract: Message-passing Graph Neural Networks (GNNs), which collect information from adjacent nodes, achieve dismal performance on heterophilic graphs. Various schemes have been proposed to solve this problem, and propagating signed information on heterophilic edges has gained great attention. Recently, some works provided theoretical analyses showing that signed propagation always leads to performance improvement under a binary-class scenario. However, we notice that prior analyses do not align well with multi-class benchmark datasets. This paper provides a new understanding of signed propagation for multi-class scenarios and points out two drawbacks in terms of message-passing and parameter update: (1) Message-passing: if two nodes belong to different classes but have a high similarity, signed propagation can decrease the separability. (2) Parameter update: the prediction uncertainty (e.g., conflicting evidence) of signed neighbors increases during training, which can impede the stability of the algorithm. Based on these observations, we introduce two novel strategies for improving signed propagation under multi-class graphs. The proposed scheme combines calibration to secure robustness while reducing uncertainty. We show the efficacy of our theorem through extensive experiments on six benchmark graph datasets.
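To make the mechanism concrete, here is a minimal sketch of plain versus signed message passing on a toy heterophilic graph; the sign heuristic (flip the contribution of a neighbor whose pseudo-label disagrees) and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def propagate(features, edges, signs=None):
    """One round of mean aggregation. signs[e] = -1 flips the contribution
    of edge e (signed propagation); signs=None means plain propagation."""
    out = features.copy()
    deg = np.ones(features.shape[0])
    for e, (u, v) in enumerate(edges):
        s = 1.0 if signs is None else signs[e]
        out[u] += s * features[v]
        out[v] += s * features[u]
        deg[u] += 1.0
        deg[v] += 1.0
    return out / deg[:, None]

# Toy heterophilic graph: nodes 0-1 are class A, nodes 2-3 are class B,
# yet two of the three edges cross the class boundary.
features = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
edges = [(0, 2), (1, 3), (0, 1)]
pseudo = np.array([0, 0, 1, 1])  # assumed pseudo-labels used to pick edge signs
signs = [1.0 if pseudo[u] == pseudo[v] else -1.0 for u, v in edges]

print(propagate(features, edges))         # plain MP pulls the two classes together
print(propagate(features, edges, signs))  # signed MP keeps them apart
```

Drawback (1) from the abstract corresponds to the case where the two endpoints of a crossing edge have nearly identical features despite different classes: the flipped message then carries little class-discriminative signal and can reduce separability.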
Related papers
- Self-Supervised Conditional Distribution Learning on Graphs [15.730933577970687]
We present an end-to-end graph representation learning model to align the conditional distributions of weakly and strongly augmented features over the original features.
This alignment effectively reduces the risk of disrupting intrinsic semantic information through graph-structured data augmentation.
arXiv Detail & Related papers (2024-11-20T07:26:36Z)
- Virtual Node Generation for Node Classification in Sparsely-Labeled Graphs [2.0060301665996016]
This paper presents a novel node generation method that infuses a small set of high-quality synthesized nodes into the graph as additional labeled nodes.
It is compatible with most popular graph pre-training (self-supervised learning), semi-supervised learning, and meta-learning methods.
Our experiments demonstrate statistically significant performance improvements over 14 baselines on 10 publicly available datasets.
arXiv Detail & Related papers (2024-09-12T02:36:44Z)
- Better Not to Propagate: Understanding Edge Uncertainty and Over-smoothing in Signed Graph Neural Networks [3.4498722449655066]
We propose a novel method for estimating homophily and edge error ratio, integrated with dynamic selection between blocked and signed propagation during training.
Our theoretical analysis, supported by extensive experiments, demonstrates that blocking MP can be more effective than signed propagation under high edge error ratios.
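As a rough illustration of the selection rule summarized above, the sketch below switches between plain, signed, and blocked propagation based on an estimated edge error ratio; the thresholds and the function name are assumptions for illustration, not the paper's actual estimator.

```python
def choose_propagation(edge_error_ratio, sign_threshold=0.2, block_threshold=0.5):
    """Toy policy: keep plain message passing when the estimated edge error
    ratio is low, flip signs on suspect edges at moderate ratios, and block
    propagation entirely when the ratio is high (assumed thresholds)."""
    if edge_error_ratio < sign_threshold:
        return "plain"
    if edge_error_ratio < block_threshold:
        return "signed"
    return "blocked"

for ratio in (0.1, 0.35, 0.7):
    print(f"estimated error ratio {ratio:.2f} -> {choose_propagation(ratio)}")
```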
arXiv Detail & Related papers (2024-08-09T06:46:06Z)
- ALEX: Towards Effective Graph Transfer Learning with Noisy Labels [11.115297917940829]
We introduce a novel technique termed Balance Alignment and Information-aware Examination (ALEX) to address the problem of graph transfer learning.
ALEX first employs singular value decomposition to generate different views with crucial structural semantics, which help provide robust node representations.
Building on this foundation, an adversarial domain discriminator is incorporated for the implicit domain alignment of complex multi-modal distributions.
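A minimal sketch of the SVD-based view generation mentioned above: truncating the singular value decomposition of the adjacency matrix yields a denoised structural view. The rank and the helper name are assumptions of this sketch, not ALEX's exact procedure.

```python
import numpy as np

def low_rank_view(adj, rank=2):
    """Return a rank-limited reconstruction of the adjacency matrix,
    keeping only the dominant structural components (illustrative)."""
    u, s, vt = np.linalg.svd(adj, full_matrices=False)
    return (u[:, :rank] * s[:rank]) @ vt[:rank, :]

adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
print(low_rank_view(adj, rank=2).round(2))  # smoothed "view" of the structure
```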
arXiv Detail & Related papers (2023-09-26T04:59:49Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
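The sketch below shows a dense, quadratic-cost stand-in for the all-pair scheme: pairwise similarity logits are perturbed with Gumbel noise, softmax-normalized, and used to aggregate over every node pair. NodeFormer's actual kernelized operator avoids materializing the N x N matrix; the function below is only an assumed illustration.

```python
import numpy as np

def all_pair_gumbel_mp(node_feats, tau=0.5, seed=0):
    """Dense O(N^2) illustration of Gumbel-Softmax all-pair message passing."""
    rng = np.random.default_rng(seed)
    logits = node_feats @ node_feats.T                   # pairwise similarity logits
    gumbel = -np.log(-np.log(rng.random(logits.shape)))  # Gumbel(0, 1) noise
    scores = (logits + gumbel) / tau
    scores -= scores.max(axis=1, keepdims=True)          # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)        # row-wise softmax
    return weights @ node_feats                          # aggregate from all nodes

x = np.random.default_rng(1).normal(size=(5, 3))
print(all_pair_gumbel_mp(x).shape)  # (5, 3)
```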
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D²PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure, but also on a global graph that encodes global semantic similarities.
arXiv Detail & Related papers (2023-05-29T04:51:09Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- Interpretable Signed Link Prediction with Signed Infomax Hyperbolic Graph [54.03786611989613]
Signed link prediction in social networks aims to reveal the underlying relationships (i.e. links) among users (i.e. nodes).
We develop a unified framework, termed Signed Infomax Hyperbolic Graph (SIHG).
In order to model high-order user relations and complex hierarchies, the node embeddings are projected and measured in a hyperbolic space with a lower distortion.
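For the hyperbolic-space measurement mentioned above, a minimal sketch on the Poincaré ball model is given below (projection by norm clipping and the standard Poincaré distance); whether SIHG uses exactly this model is an assumption of the sketch.

```python
import numpy as np

def to_poincare_ball(x, eps=1e-5):
    """Clip a Euclidean embedding into the open unit ball (illustrative projection)."""
    norm = np.linalg.norm(x)
    return x if norm < 1.0 - eps else x * ((1.0 - eps) / norm)

def poincare_distance(u, v):
    """d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))."""
    sq_diff = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_diff / denom)

u = to_poincare_ball(np.array([0.3, 0.4]))
v = to_poincare_ball(np.array([-0.6, 0.7]))
print(poincare_distance(u, v))
```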
arXiv Detail & Related papers (2020-11-25T05:09:03Z)
- Graph Contrastive Learning with Adaptive Augmentation [23.37786673825192]
We propose a novel graph contrastive representation learning method with adaptive augmentation.
Specifically, we design augmentation schemes based on node centrality measures to highlight important connective structures.
Our proposed method consistently outperforms existing state-of-the-art baselines and even surpasses some supervised counterparts.
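A minimal sketch of centrality-adaptive edge dropping in the spirit described above: edges attached to high-centrality endpoints are dropped less often, so important connective structures survive augmentation. Degree is used here as a cheap centrality proxy; the paper discusses several centrality measures and also adapts feature masking, so this is only an assumed illustration.

```python
import numpy as np

def adaptive_edge_drop(edges, num_nodes, max_drop=0.7, seed=0):
    """Drop each edge with probability shrinking in the (degree) centrality of
    its endpoints, so structurally important connections are kept more often."""
    rng = np.random.default_rng(seed)
    degree = np.zeros(num_nodes)
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    importance = np.array([(degree[u] + degree[v]) / 2.0 for u, v in edges])
    importance /= importance.max()                # normalize to (0, 1]
    drop_prob = max_drop * (1.0 - importance)     # important edges -> low drop prob
    keep = rng.random(len(edges)) >= drop_prob
    return [e for e, k in zip(edges, keep) if k]

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
print(adaptive_edge_drop(edges, num_nodes=4))     # one augmented "view" of the graph
```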
arXiv Detail & Related papers (2020-10-27T15:12:21Z)
- Contrastive and Generative Graph Convolutional Networks for Graph-based Semi-Supervised Learning [64.98816284854067]
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph.
A novel GCN-based SSL algorithm is presented in this paper to enrich the supervision signals by utilizing both data similarities and graph structure.
arXiv Detail & Related papers (2020-09-15T13:59:28Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
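A minimal sketch of the decoupled, multi-hop aggregation idea: propagate features for several hops with a normalized adjacency matrix and combine the per-hop representations with weights. DAGNN learns these weights adaptively per node; the fixed uniform weights below are an assumption of the sketch.

```python
import numpy as np

def multi_hop_aggregate(adj, features, hops=4, hop_weights=None):
    """Stack 0..hops rounds of normalized propagation, then mix the hops."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    norm_adj = d_inv_sqrt @ adj @ d_inv_sqrt
    reps = [features]
    for _ in range(hops):
        reps.append(norm_adj @ reps[-1])                   # one more hop of receptive field
    if hop_weights is None:
        hop_weights = np.full(hops + 1, 1.0 / (hops + 1))  # uniform fallback
    return sum(w * r for w, r in zip(hop_weights, reps))

adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
x = np.eye(3)
print(multi_hop_aggregate(adj, x, hops=3).round(3))
```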
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.