Structural Imbalance Aware Graph Augmentation Learning
- URL: http://arxiv.org/abs/2303.13757v1
- Date: Fri, 24 Mar 2023 02:13:32 GMT
- Title: Structural Imbalance Aware Graph Augmentation Learning
- Authors: Zulong Liu, Kejia Chen, Zheng Liu
- Abstract summary: Graphs are often structurally imbalanced, that is, only a few hub nodes have a denser local structure and higher influence.
This paper proposes a selective graph augmentation method (SAug) to solve this problem.
Extensive experiments demonstrate that SAug can significantly improve the backbone GNNs and achieve superior performance to its competitors.
- Score: 2.793446335600599
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph machine learning (GML) has made great progress in node classification,
link prediction, graph classification and so on. However, graphs in reality are
often structurally imbalanced, that is, only a few hub nodes have a denser
local structure and higher influence. The imbalance may compromise the
robustness of existing GML models, especially in learning tail nodes. This
paper proposes a selective graph augmentation method (SAug) to address this
problem. First, a PageRank-based sampling strategy is designed to identify
hub nodes and tail nodes in the graph. Second, a selective augmentation
strategy is proposed: it drops the noisy neighbors of hub nodes on the one
hand, and discovers latent neighbors and generates pseudo neighbors for tail
nodes on the other. This also alleviates the structural imbalance between
the two types of nodes. Finally, a GNN model is retrained on the augmented
graph. Extensive experiments demonstrate that SAug significantly improves
the backbone GNNs and achieves superior performance over competing graph
augmentation methods and hub/tail-aware methods.
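The two-step recipe described in the abstract (identify hubs and tails by PageRank, then augment selectively) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the power-iteration PageRank, the fixed `hub_fraction` threshold, and the helper names are all assumptions.

```python
def pagerank(adj, damping=0.85, iters=50):
    """Power-iteration PageRank over an adjacency dict {node: [neighbors]}."""
    nodes = list(adj)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        nxt = {v: (1.0 - damping) / n for v in nodes}
        for v in nodes:
            if adj[v]:
                share = damping * rank[v] / len(adj[v])
                for u in adj[v]:
                    nxt[u] += share
            else:  # dangling node: spread its mass uniformly
                for u in nodes:
                    nxt[u] += damping * rank[v] / n
        rank = nxt
    return rank

def split_hub_tail(adj, hub_fraction=0.1):
    """Step 1 (sketch): label the top-ranked fraction of nodes as hubs."""
    ranked = sorted(adj, key=pagerank(adj).get, reverse=True)
    n_hubs = max(1, int(len(ranked) * hub_fraction))
    return set(ranked[:n_hubs]), set(ranked[n_hubs:])

# A star graph: node 0 is the sole hub, leaves 1..10 are tails.
star = {0: list(range(1, 11)), **{i: [0] for i in range(1, 11)}}
hubs, tails = split_hub_tail(star)
print(hubs)   # {0}
```

Step 2 (dropping noisy hub neighbors, adding latent or pseudo neighbors for tails) depends on the model's similarity scores and is omitted here.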
Related papers
- SF-GNN: Self Filter for Message Lossless Propagation in Deep Graph Neural Network [38.669815079957566]
Graph Neural Networks (GNNs), whose main idea is to encode graph structure information through propagation and aggregation, have developed rapidly.
They have achieved excellent performance in representation learning on multiple types of graphs, such as homogeneous graphs, heterogeneous graphs, and more complex graphs like knowledge graphs.
We propose a new perspective on the phenomenon of performance degradation in deep GNNs.
arXiv Detail & Related papers (2024-07-03T02:40:39Z)
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- Self-Attention Empowered Graph Convolutional Network for Structure Learning and Node Embedding [5.164875580197953]
In representation learning on graph-structured data, many popular graph neural networks (GNNs) fail to capture long-range dependencies.
This paper proposes a novel graph learning framework called the graph convolutional network with self-attention (GCN-SA).
The proposed scheme exhibits an exceptional generalization capability in node-level representation learning.
arXiv Detail & Related papers (2024-03-06T05:00:31Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned, separately, for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- SAILOR: Structural Augmentation Based Tail Node Representation Learning [49.19653803667422]
Graph Neural Networks (GNNs) have recently achieved state-of-the-art performance in representation learning for graphs.
Most of the graphs in real-world scenarios follow a long-tailed distribution on their node degrees, that is, a vast majority of the nodes in the graph are tail nodes with only a few connected edges.
We propose a general Structural Augmentation based taIL nOde Representation learning framework, dubbed as SAILOR, which can jointly learn to augment the graph structure and extract more informative representations for tail nodes.
arXiv Detail & Related papers (2023-08-13T16:04:03Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Deformable Graph Convolutional Networks [12.857403315970231]
Graph neural networks (GNNs) have significantly improved representation power for graph-structured data.
In this paper, we propose Deformable Graph Convolutional Networks (Deformable GCNs) that adaptively perform convolution in multiple latent spaces.
Our framework simultaneously learns the node positional embeddings to determine the relations between nodes in an end-to-end fashion.
arXiv Detail & Related papers (2021-12-29T07:55:29Z)
- Imbalanced Graph Classification via Graph-of-Graph Neural Networks [16.589373163769853]
Graph Neural Networks (GNNs) have achieved unprecedented success in learning graph representations to identify categorical labels of graphs.
We introduce a novel framework, Graph-of-Graph Neural Networks (G$^2$GNN), which alleviates the graph imbalance issue by deriving extra supervision globally from neighboring graphs and locally from the graphs themselves.
Our proposed G$^2$GNN outperforms numerous baselines by roughly 5% in both F1-macro and F1-micro scores.
arXiv Detail & Related papers (2021-12-01T02:25:47Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
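The pairwise-interaction idea in the BGNN abstract above can be illustrated with a small sketch: the conventional sum of linearly transformed neighbor features is augmented with the average of element-wise products over neighbor pairs. The specific normalization and helper names here are assumptions for illustration, not the paper's exact operator.

```python
import numpy as np

def bilinear_aggregate(neighbor_feats, W):
    """Sum aggregation plus averaged pairwise element-wise interactions
    of transformed neighbor features (illustrative sketch)."""
    h = neighbor_feats @ W.T          # transform each neighbor: (k, d_out)
    summed = h.sum(axis=0)            # conventional weighted-sum term
    pairs = [h[i] * h[j]
             for i in range(len(h)) for j in range(i + 1, len(h))]
    interaction = np.mean(pairs, axis=0) if pairs else np.zeros_like(summed)
    return summed + interaction

# Two one-hot neighbors under an identity transform: their element-wise
# product vanishes, so only the plain sum survives.
out = bilinear_aggregate(np.eye(2), np.eye(2))
print(out)   # [1. 1.]
```

The interaction term is what distinguishes a bilinear aggregator from a standard weighted sum: neighbors that share active feature dimensions reinforce each other beyond what either contributes alone.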
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.