GraphRARE: Reinforcement Learning Enhanced Graph Neural Network with Relative Entropy
- URL: http://arxiv.org/abs/2312.09708v2
- Date: Sat, 13 Apr 2024 08:52:55 GMT
- Title: GraphRARE: Reinforcement Learning Enhanced Graph Neural Network with Relative Entropy
- Authors: Tianhao Peng, Wenjun Wu, Haitao Yuan, Zhifeng Bao, Pengrui Zhao, Xin Yu, Xuetao Lin, Yu Liang, Yanjun Pu
- Abstract summary: GraphRARE is a framework built upon node relative entropy and deep reinforcement learning.
An innovative node relative entropy is used to measure mutual information between node pairs.
A deep reinforcement learning-based algorithm is developed to optimize the graph topology.
- Score: 21.553180564868306
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have shown advantages in graph-based analysis tasks. However, most existing methods rely on the homophily assumption and show poor performance on heterophilic graphs, where linked nodes have dissimilar features and different class labels, and semantically related nodes may be multiple hops away. To address this limitation, this paper presents GraphRARE, a general framework built upon node relative entropy and deep reinforcement learning, to strengthen the expressive capability of GNNs. An innovative node relative entropy, which considers node features and structural similarity, is used to measure mutual information between node pairs. In addition, to avoid the sub-optimal solutions caused by mixing the useful information of remote nodes with noise, a deep reinforcement learning-based algorithm is developed to optimize the graph topology. This algorithm selects informative nodes and discards noisy nodes based on the defined node relative entropy. Extensive experiments are conducted on seven real-world datasets. The experimental results demonstrate the superiority of GraphRARE in node classification and its capability to optimize the original graph topology.
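As a rough, hypothetical illustration of the node relative entropy idea (the paper's exact definition is not reproduced here), the sketch below scores a node pair with a symmetrized KL term over softmax-normalized features plus a neighborhood-overlap term; the function names, the Jaccard structural term, and the mixing weight alpha are assumptions for illustration only.

    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        # KL divergence D(p || q) between two discrete distributions.
        p = np.clip(p, eps, None); p = p / p.sum()
        q = np.clip(q, eps, None); q = q / q.sum()
        return float(np.sum(p * np.log(p / q)))

    def node_pair_score(x_u, x_v, neigh_u, neigh_v, alpha=0.5):
        # Hypothetical pairwise score mixing a feature term and a structural term.
        # Feature term: symmetrized KL between softmax-normalized feature vectors.
        p = np.exp(x_u - x_u.max())
        q = np.exp(x_v - x_v.max())
        feat_div = 0.5 * (kl_divergence(p, q) + kl_divergence(q, p))
        # Structural term: Jaccard overlap of the two nodes' neighbor sets.
        overlap = len(neigh_u & neigh_v) / (len(neigh_u | neigh_v) or 1)
        # Higher overlap and lower feature divergence -> more informative pair.
        return alpha * overlap - (1 - alpha) * feat_div

In GraphRARE, scores of this kind feed a deep reinforcement learning policy that keeps informative nodes and discards noisy ones when rewiring the topology; the sketch only shows the scoring side.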
Related papers
- Self-Attention Empowered Graph Convolutional Network for Structure Learning and Node Embedding [5.164875580197953]
In representation learning on graph-structured data, many popular graph neural networks (GNNs) fail to capture long-range dependencies.
This paper proposes a novel graph learning framework called the graph convolutional network with self-attention (GCN-SA).
The proposed scheme exhibits an exceptional generalization capability in node-level representation learning.
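The summary above describes structure learning with self-attention; below is a generic, minimal sketch of that pattern (scaled dot-product attention over node features producing a soft adjacency), not necessarily GCN-SA's exact architecture.

    import torch
    import torch.nn as nn

    class AttentionStructureLearner(nn.Module):
        # Learns a dense node-node affinity matrix from node features alone.
        def __init__(self, in_dim, hid_dim):
            super().__init__()
            self.q = nn.Linear(in_dim, hid_dim)
            self.k = nn.Linear(in_dim, hid_dim)

        def forward(self, x):                         # x: [N, in_dim]
            q, k = self.q(x), self.k(x)
            scores = q @ k.t() / k.shape[-1] ** 0.5   # [N, N] pairwise affinities
            # Row-normalized "soft" adjacency that can capture long-range links
            # missed by the original graph.
            return torch.softmax(scores, dim=-1)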
arXiv Detail & Related papers (2024-03-06T05:00:31Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
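The kernelized Gumbel-Softmax operator is NodeFormer's own contribution; the sketch below shows only the plain (non-kernelized) Gumbel-Softmax trick for differentiable neighbor sampling, as a point of reference.

    import torch

    def gumbel_softmax_neighbors(logits, tau=0.5):
        # logits: [N, N] unnormalized scores for connecting each node to every other node.
        # Returns a nearly one-hot, differentiable neighbor selection per row.
        gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
        return torch.softmax((logits + gumbel) / tau, dim=-1)

The kernelization is what makes the all-pair scheme scalable; that part is not shown here.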
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Inferential SIR-GN: Scalable Graph Representation Learning [0.4699313647907615]
Graph representation learning methods generate numerical vector representations for the nodes in a network.
In this work, we propose Inferential SIR-GN, a model which is pre-trained on random graphs, then computes node representations rapidly.
We demonstrate that the model is able to capture nodes' structural role information, and show excellent performance at node and graph classification tasks on unseen networks.
arXiv Detail & Related papers (2021-11-08T20:56:37Z)
- Reasoning Graph Networks for Kinship Verification: from Star-shaped to Hierarchical [85.0376670244522]
We investigate the problem of facial kinship verification by learning hierarchical reasoning graph networks.
We develop a Star-shaped Reasoning Graph Network (S-RGN) and a Hierarchical Reasoning Graph Network (H-RGN) to provide more powerful and flexible reasoning capacity.
arXiv Detail & Related papers (2021-09-06T03:16:56Z)
- Graph Entropy Guided Node Embedding Dimension Selection for Graph Neural Networks [74.26734952400925]
We propose a novel Minimum Graph Entropy (MinGE) algorithm for Node Embedding Dimension Selection (NEDS).
MinGE considers both feature entropy and structure entropy on graphs, each carefully designed to capture the rich information carried by node features and graph structure.
Experiments with popular Graph Neural Networks (GNNs) on benchmark datasets demonstrate the effectiveness and generalizability of our proposed MinGE.
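As a toy, hedged illustration of combining feature entropy and structure entropy (not MinGE's actual formulas), the snippet below computes Shannon entropies of a degree distribution and of binned feature counts; MinGE's real objective and its mapping to an embedding dimension differ.

    import numpy as np

    def shannon_entropy(weights):
        # Shannon entropy (in bits) of the distribution induced by non-negative weights.
        p = np.asarray(weights, dtype=float)
        p = p / p.sum()
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    degrees = [3, 1, 2, 2, 4, 1, 3]            # toy degree sequence (structure side)
    binned_feature_counts = [10, 4, 4, 2]      # toy histogram of feature values (feature side)

    structure_entropy = shannon_entropy(degrees)
    feature_entropy = shannon_entropy(binned_feature_counts)
    # MinGE-style idea: compare candidate embedding dimensions by the graph entropy
    # they induce and keep the minimizer; here we only print the two components.
    print(f"structure: {structure_entropy:.3f} bits, feature: {feature_entropy:.3f} bits")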
arXiv Detail & Related papers (2021-05-07T11:40:29Z)
- Schema-Aware Deep Graph Convolutional Networks for Heterogeneous Graphs [10.526065883783899]
Graph convolutional network (GCN) based approaches have achieved significant progress for solving complex, graph-structured problems.
We propose a GCN framework, the Deep Heterogeneous Graph Convolutional Network (DHGCN).
It takes advantage of the schema of a heterogeneous graph and uses a hierarchical approach to effectively utilize information many hops away.
arXiv Detail & Related papers (2021-05-03T06:24:27Z)
- Node2Seq: Towards Trainable Convolutions in Graph Neural Networks [59.378148590027735]
We propose a graph network layer, known as Node2Seq, to learn node embeddings with explicitly trainable weights for different neighboring nodes.
For a target node, our method sorts its neighboring nodes via an attention mechanism and then employs 1D convolutional neural networks (CNNs) to enable explicit weights for information aggregation.
In addition, we propose to incorporate non-local information for feature learning in an adaptive manner based on the attention scores.
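Roughly in the spirit of the mechanism described above (attention-based sorting of neighbors followed by a 1D convolution), here is a minimal single-head sketch; the bilinear scoring, tensor shapes, and mean aggregation are assumptions, not Node2Seq's exact design.

    import torch
    import torch.nn as nn

    class AttnSortConv(nn.Module):
        # Scores neighbors against the target node, sorts them by that score,
        # then aggregates the ordered sequence with a 1D convolution.
        def __init__(self, dim, kernel_size=3):
            super().__init__()
            self.score = nn.Bilinear(dim, dim, 1)
            self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)

        def forward(self, target, neighbors):           # target: [dim], neighbors: [K, dim]
            s = self.score(target.expand_as(neighbors), neighbors).squeeze(-1)  # [K]
            order = torch.argsort(s, descending=True)   # explicit ordering via attention
            seq = neighbors[order].t().unsqueeze(0)     # [1, dim, K] for Conv1d
            out = self.conv(seq).mean(dim=-1)           # trainable per-position weights
            return out.squeeze(0)                       # aggregated [dim] representation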
arXiv Detail & Related papers (2021-01-06T03:05:37Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
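Below is a generic stand-in for proximity-based pooling (score nodes by multi-hop adjacency mass, keep the top fraction, and induce the sub-adjacency); it is a hedged sketch under assumed dense inputs, not the paper's exact pooling operator.

    import torch

    def proximity_topk_pool(x, adj, ratio=0.5, hops=2):
        # x: [N, F] node features; adj: [N, N] dense (float) adjacency matrix.
        prox = torch.zeros_like(adj)
        power = torch.eye(adj.shape[0], dtype=adj.dtype)
        for _ in range(hops):
            power = power @ adj                 # walks of length 1..hops
            prox = prox + power
        scores = prox.sum(dim=1)                # multi-hop proximity mass per node
        k = max(1, int(ratio * adj.shape[0]))
        keep = torch.topk(scores, k).indices    # retain the best-connected nodes
        return x[keep], adj[keep][:, keep]      # pooled features and induced sub-graph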
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.