TAM: Topology-Aware Margin Loss for Class-Imbalanced Node Classification
- URL: http://arxiv.org/abs/2206.12917v1
- Date: Sun, 26 Jun 2022 16:29:36 GMT
- Title: TAM: Topology-Aware Margin Loss for Class-Imbalanced Node Classification
- Authors: Jaeyun Song, Joonhyung Park, Eunho Yang
- Abstract summary: We propose Topology-Aware Margin (TAM) to reflect local topology on the learning objective.
Our method consistently exhibits superiority over the baselines on various node classification benchmark datasets.
- Score: 33.028354930416754
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning unbiased node representations from class-imbalanced graph
data is challenging due to interactions between adjacent nodes. Existing
studies share a common limitation: they compensate minor-class nodes 'as a
group' according to their overall quantity (ignoring node connections in the
graph), which inevitably increases false positives for major-class nodes. We
hypothesize that the increase in these false positives is strongly affected by
the label distribution around each node, and we confirm this experimentally. To
handle this issue, we propose Topology-Aware Margin (TAM), which reflects local
topology in the learning objective. Our method compares the connectivity
pattern of each node with its class-averaged counterpart and adaptively adjusts
the margin accordingly. TAM consistently outperforms the baselines on various
node classification benchmark datasets with representative GNN architectures.
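As a rough illustration of the margin adjustment described above, the following sketch shifts each node's logits by how far its neighbor-label distribution deviates from a class-averaged connectivity pattern. The deviation measure and the scaling coefficient `alpha` are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def tam_adjusted_logits(logits, neighbor_label_dist, class_avg_dist, alpha=0.5):
    """Shift each node's logits by how far its neighbor-label distribution
    deviates from the class-averaged connectivity pattern.

    logits:              (num_nodes, num_classes) raw classifier scores
    neighbor_label_dist: (num_nodes, num_classes) label fractions among neighbors
    class_avg_dist:      (num_nodes, num_classes) class-averaged label fractions
    alpha:               hypothetical scaling coefficient (an assumption here)
    """
    # More class-c neighbors than typical -> a more permissive margin for
    # class c; fewer than typical -> a stricter one.
    deviation = neighbor_label_dist - class_avg_dist
    return logits + alpha * deviation
```

The adjusted logits would then feed a standard cross-entropy or margin loss, so nodes whose local topology deviates from their class's typical connectivity are penalized differently from typical nodes.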
Related papers
- Rethinking Independent Cross-Entropy Loss For Graph-Structured Data [41.92169850308025]
Graph neural networks (GNNs) have exhibited prominent performance in learning graph-structured data.
In this work, we propose a new framework, termed joint-cluster supervised learning, to model the joint distribution of each node with its corresponding cluster.
In this way, the data-label reference signals extracted from the local cluster explicitly strengthen the discrimination ability on the target node.
arXiv Detail & Related papers (2024-05-24T13:52:41Z)
- Heterophily-Based Graph Neural Network for Imbalanced Classification [19.51668009720269]
We introduce a unique approach that tackles imbalanced classification on graphs by considering graph heterophily.
We propose Fast Im-GBK, which integrates an imbalance classification strategy with heterophily-aware GNNs.
Our experiments on real-world graphs demonstrate our model's superiority in classification performance and efficiency for node classification tasks.
arXiv Detail & Related papers (2023-10-12T21:19:47Z)
- When Do Graph Neural Networks Help with Node Classification? Investigating the Impact of Homophily Principle on Node Distinguishability [92.8279562472538]
The homophily principle has been believed to be the main reason for the performance superiority of Graph Neural Networks (GNNs) over Neural Networks on node classification tasks.
Recent research suggests that, even in the absence of homophily, the advantage of GNNs still exists as long as nodes from the same class share similar neighborhood patterns.
arXiv Detail & Related papers (2023-04-25T09:40:47Z)
- UNREAL: Unlabeled Nodes Retrieval and Labeling for Heavily-Imbalanced Node Classification [17.23736166919287]
Skewed label distributions are common in real-world node classification tasks.
In this paper, we propose UNREAL, an iterative over-sampling method.
arXiv Detail & Related papers (2023-03-18T09:23:13Z)
- Pseudo Contrastive Learning for Graph-based Semi-supervised Learning [67.37572762925836]
Pseudo-labeling is a technique used to improve the performance of Graph Neural Networks (GNNs).
We propose a general framework for GNNs, termed Pseudo Contrastive Learning (PCL).
arXiv Detail & Related papers (2023-02-19T10:34:08Z)
- Simplifying Node Classification on Heterophilous Graphs with Compatible Label Propagation [6.071760028190454]
A well-known graph algorithm, Label Propagation, combined with a shallow neural network, can achieve performance comparable to GNNs in semi-supervised node classification on graphs with high homophily.
In this paper, we show that this approach falls short on graphs with low homophily, where nodes often connect to nodes of opposite classes.
Our algorithm first learns the class compatibility matrix and then aggregates label predictions using LP algorithm weighted by class compatibilities.
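A minimal sketch of the compatibility-weighted propagation idea described above. The iteration count, row normalization, and label clamping are assumptions; the paper's algorithm learns the compatibility matrix `H`, which here is supplied directly:

```python
import numpy as np

def compatible_label_propagation(A, Y, H, mask, num_iters=10):
    """Label propagation weighted by a class-compatibility matrix H.

    A:    (n, n) adjacency matrix
    Y:    (n, c) one-hot labels (zero rows for unlabeled nodes)
    H:    (c, c) class-compatibility matrix (assumed given, not learned here)
    mask: (n,) boolean array marking labeled nodes
    """
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    P = A / deg                                   # row-normalized propagation
    F = Y.astype(float).copy()
    for _ in range(num_iters):
        F = P @ F @ H                             # propagate through compatibility
        F = F / F.sum(axis=1, keepdims=True).clip(min=1e-12)
        F[mask] = Y[mask]                         # clamp known labels
    return F
```

On a heterophilous graph, an off-diagonal-heavy `H` lets a class-0 neighbor raise the evidence for class 1, which plain label propagation cannot do.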
arXiv Detail & Related papers (2022-05-19T08:34:34Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
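For reference, a common definition of von Neumann graph entropy computes the Shannon entropy of the trace-normalized Laplacian spectrum; the specific metric proposed in the paper may differ, so this is only a generic sketch:

```python
import numpy as np

def von_neumann_graph_entropy(A):
    """Von Neumann entropy of a graph from the spectrum of its Laplacian,
    scaled to unit trace (a density-matrix analogue). Generic formulation,
    not necessarily the paper's exact heterophily metric."""
    d = A.sum(axis=1)
    L = np.diag(d) - A                 # combinatorial Laplacian
    rho = L / np.trace(L)              # unit-trace "density matrix"
    eig = np.linalg.eigvalsh(rho)
    eig = eig[eig > 1e-12]             # 0 * log(0) treated as 0
    return float(-(eig * np.log(eig)).sum())
```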
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues that limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, differentiable group normalization (DGN).
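A simplified sketch of the group-wise normalization idea: normalize node embeddings separately within soft cluster groups, so distinct groups keep distinct statistics instead of being smoothed together. The assignment matrix `S` and the exact weighting are assumptions; DGN's full formulation differs in details:

```python
import numpy as np

def group_normalize(X, S, eps=1e-5):
    """Normalize node embeddings within soft groups.

    X: (n, d) node embeddings
    S: (n, g) soft group-assignment matrix (rows are membership weights)
    """
    out = np.zeros_like(X, dtype=float)
    for g in range(S.shape[1]):
        w = S[:, g:g + 1]                              # (n, 1) memberships
        total = max(float(w.sum()), 1e-12)
        mu = (w * X).sum(axis=0) / total               # weighted group mean
        var = (w * (X - mu) ** 2).sum(axis=0) / total  # weighted group variance
        out += w * (X - mu) / np.sqrt(var + eps)
    return out
```

With a single group this reduces to ordinary feature normalization; with several groups, each cluster is standardized independently, which is what counteracts over-smoothing across clusters.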
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
- Unifying Graph Convolutional Neural Networks and Label Propagation [73.82013612939507]
We study the relationship between LPA and GCN in terms of two aspects: feature/label smoothing and feature/label influence.
Based on our theoretical analysis, we propose an end-to-end model that unifies GCN and LPA for node classification.
Our model can also be seen as learning attention weights based on node labels, which is more task-oriented than existing feature-based attention models.
arXiv Detail & Related papers (2020-02-17T03:23:13Z)
- Graph Inference Learning for Semi-supervised Classification [50.55765399527556]
We propose a Graph Inference Learning framework to boost the performance of semi-supervised node classification.
For learning the inference process, we introduce meta-optimization on structure relations from training nodes to validation nodes.
Comprehensive evaluations on four benchmark datasets demonstrate the superiority of our proposed GIL when compared against state-of-the-art methods.
arXiv Detail & Related papers (2020-01-17T02:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.