MAG-GNN: Reinforcement Learning Boosted Graph Neural Network
- URL: http://arxiv.org/abs/2310.19142v1
- Date: Sun, 29 Oct 2023 20:32:21 GMT
- Title: MAG-GNN: Reinforcement Learning Boosted Graph Neural Network
- Authors: Lecheng Kong, Jiarui Feng, Hao Liu, Dacheng Tao, Yixin Chen, Muhan
Zhang
- Abstract summary: A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity and achieved great success.
However, this effectiveness comes at the cost of efficiency, since such models enumerate all possible subgraphs.
We propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve the problem.
- Score: 68.60884768323739
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While Graph Neural Networks (GNNs) recently became powerful tools in graph
learning tasks, considerable efforts have been spent on improving GNNs'
structural encoding ability. A particular line of work proposed subgraph GNNs
that use subgraph information to improve GNNs' expressivity and achieved great
success. However, this effectiveness comes at the cost of efficiency, as these
models enumerate all possible subgraphs. In this paper, we analyze the necessity of
complete subgraph enumeration and show that a model can achieve a comparable
level of expressivity by considering a small subset of the subgraphs. We then
formulate the identification of the optimal subset as a combinatorial
optimization problem and propose Magnetic Graph Neural Network (MAG-GNN), a
reinforcement learning (RL) boosted GNN, to solve the problem. Starting with a
candidate subgraph set, MAG-GNN employs an RL agent to iteratively update the
subgraphs to locate the most expressive set for prediction. This reduces the
exponential complexity of subgraph enumeration to the constant complexity of a
subgraph search algorithm while keeping good expressivity. We conduct extensive
experiments on many datasets, showing that MAG-GNN achieves competitive
performance to state-of-the-art methods and even outperforms many subgraph
GNNs. We also demonstrate that MAG-GNN effectively reduces the running time of
subgraph GNNs.
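The abstract describes the search loop only at a high level. Below is a minimal, self-contained sketch of the idea in Python; the greedy update rule, the embedding-diversity reward, and the toy subgraph embedding are all stand-ins of mine for the trained RL agent, learned reward, and GNN used in the paper.

```python
import random

def subgraph_embedding(graph, marked_nodes):
    # Stand-in for a node-marking subgraph GNN: summarize a marked node
    # set by its sorted degrees. The real model would run message
    # passing over the graph with the marked nodes specially labeled.
    return tuple(sorted(len(graph[v]) for v in marked_nodes))

def mag_gnn_style_search(graph, k=2, n_candidates=4, n_steps=20, seed=0):
    """Iteratively update a small candidate set of k-node marked
    subgraphs. A trained RL agent is replaced here by a greedy rule:
    keep a random one-node swap only if it raises a toy expressivity
    proxy (how many distinct subgraph embeddings the set produces)."""
    rng = random.Random(seed)
    nodes = list(graph)
    candidates = [set(rng.sample(nodes, k)) for _ in range(n_candidates)]

    def diversity(cands):
        return len({subgraph_embedding(graph, c) for c in cands})

    for _ in range(n_steps):
        i = rng.randrange(n_candidates)
        before, old = diversity(candidates), set(candidates[i])
        # Action: swap one marked node for an unmarked one.
        v_out = rng.choice(sorted(old))
        v_in = rng.choice([v for v in nodes if v not in old])
        candidates[i] = (old - {v_out}) | {v_in}
        if diversity(candidates) < before:
            candidates[i] = old  # revert unhelpful moves
    return candidates

# Toy graph: a 6-cycle with one chord, as adjacency sets.
g = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
g[0].add(3); g[3].add(0)
print(mag_gnn_style_search(g))
```

Because each step touches only a fixed number of candidate subgraphs, the cost of the search is independent of the exponentially many subgraphs that full enumeration would visit, which is the complexity argument the abstract makes.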
Related papers
- A Flexible, Equivariant Framework for Subgraph GNNs via Graph Products and Graph Coarsening [18.688057947275112]
Subgraph Graph Neural Networks (Subgraph GNNs) enhance the expressivity of message-passing GNNs by representing graphs as sets of subgraphs.
Previous approaches suggested processing only subsets of subgraphs, selected either randomly or via learnable sampling.
This paper introduces a new Subgraph GNN framework to address these issues.
arXiv Detail & Related papers (2024-06-13T16:29:06Z)
- Subgraphormer: Unifying Subgraph GNNs and Graph Transformers via Graph Products [20.332456342247383]
We propose Subgraphormer, an architecture that integrates Subgraph GNNs with attention and positional encodings, arguably the most important components in Graph Transformers.
Our results demonstrate significant performance improvements over both Subgraph GNNs and Graph Transformers on a wide range of datasets.
arXiv Detail & Related papers (2024-02-13T13:37:13Z)
- An Efficient Subgraph GNN with Provable Substructure Counting Power [32.44487589958533]
We investigate enhancing the representation power of graph neural networks (GNNs) through their ability to count substructures.
Recent advances have seen the adoption of subgraph GNNs, which partition an input graph into numerous subgraphs, subsequently applying GNNs to each to augment the graph's overall representation.
Despite their ability to identify various substructures, subgraph GNNs are hindered by significant computational and memory costs.
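To make that cost concrete, here is a hedged sketch of a generic node-marking subgraph GNN (an illustration of the family, not the model proposed in that paper): it runs one message-passing pass per marked node, so time and memory grow linearly with the number of enumerated subgraphs.

```python
import torch
import torch.nn as nn

class NodeMarkingSubgraphGNN(nn.Module):
    """Illustrative node-marking subgraph GNN: one forward pass per
    marked node, i.e. n passes for an n-node graph."""

    def __init__(self, in_dim, hidden):
        super().__init__()
        self.lin = nn.Linear(in_dim + 1, hidden)  # +1 for the mark bit

    def forward(self, x, adj):
        n = x.size(0)
        pooled = []
        for v in range(n):  # enumerate all n node-marked subgraphs
            mark = torch.zeros(n, 1)
            mark[v] = 1.0
            h = torch.relu(self.lin(torch.cat([x, mark], dim=1)))
            h = adj @ h            # one round of sum aggregation
            pooled.append(h.mean(0))
        return torch.stack(pooled).mean(0)  # graph-level readout

x = torch.randn(5, 3)
adj = (torch.rand(5, 5) > 0.5).float()
print(NodeMarkingSubgraphGNN(3, 8)(x, adj).shape)  # torch.Size([8])
```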
arXiv Detail & Related papers (2023-03-19T05:35:59Z)
- Explainability in subgraphs-enhanced Graph Neural Networks [12.526174412246107]
Subgraphs-enhanced Graph Neural Networks (SGNNs) have been introduced to enhance the expressive power of GNNs.
In this work, we adapt PGExplainer, one of the most recent explainers for GNNs, to SGNNs.
We show that our framework is successful in explaining the decision process of an SGNN on graph classification tasks.
arXiv Detail & Related papers (2022-09-16T13:39:10Z)
- Two-level Graph Neural Network [15.014364222532347]
We propose a novel GNN framework, referred to as the Two-level GNN (TL-GNN), which merges subgraph-level information with node-level information.
Experiments show that TL-GNN outperforms existing GNNs and achieves state-of-the-art performance.
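As a rough illustration of merging the two levels (my reading of the summary, not TL-GNN's actual architecture), one could concatenate each node's embedding with the embedding of the subgraph it belongs to:

```python
import torch

def two_level_fusion(node_h, subgraph_h, assign):
    """Hypothetical two-level fusion: pair node embeddings with the
    embeddings of their containing subgraphs. `assign[i]` gives node
    i's subgraph index (an assumed, precomputed partition)."""
    return torch.cat([node_h, subgraph_h[assign]], dim=1)

node_h = torch.randn(5, 8)      # node-level GNN output
subgraph_h = torch.randn(2, 8)  # subgraph-level embeddings
assign = torch.tensor([0, 0, 1, 1, 1])
print(two_level_fusion(node_h, subgraph_h, assign).shape)  # (5, 16)
```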
arXiv Detail & Related papers (2022-01-03T02:15:20Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks.
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
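A minimal sketch of learnable edge pruning with a sparsity penalty, in the spirit of the summary; the edge scorer, its inputs, and the penalty form are assumptions of mine, and PTDNet's actual parameterization differs.

```python
import torch
import torch.nn as nn

class LearnableEdgeDropper(nn.Module):
    """Score each edge from its endpoint features; a sigmoid turns the
    score into a soft keep-probability, and summing the probabilities
    penalizes the (expected) number of edges kept."""

    def __init__(self, dim):
        super().__init__()
        self.scorer = nn.Linear(2 * dim, 1)

    def forward(self, x, edge_index):
        src, dst = edge_index
        logits = self.scorer(torch.cat([x[src], x[dst]], dim=1))
        keep_prob = torch.sigmoid(logits.squeeze(-1))  # soft edge mask
        sparsity_penalty = keep_prob.sum()             # drives pruning
        return keep_prob, sparsity_penalty

x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
mask, penalty = LearnableEdgeDropper(8)(x, edge_index)
# total_loss = task_loss + lambda_sparsity * penalty
```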
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
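The identity at the heart of that unified view can be stated compactly. The following worked equation is a standard sketch of the result (notation mine): one gradient step on a graph-signal denoising objective reproduces GCN-style propagation.

```latex
% Denoising objective: stay close to the input features X while being
% smooth over the graph; L is a normalized graph Laplacian, c > 0.
\[
  \min_{F}\; \|F - X\|_F^2 \;+\; c\,\operatorname{tr}\!\bigl(F^\top L F\bigr)
\]
% One gradient-descent step from F = X with step size 1/2 gives
\[
  F \;\leftarrow\; X - c\,L X \;=\; (I - c\,L)\,X,
\]
% which for c = 1 and L = I - \tilde{A} is exactly GCN-style
% propagation F = \tilde{A}X with normalized adjacency \tilde{A}.
```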
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- The Surprising Power of Graph Neural Networks with Random Node Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI).
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
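Mechanically, RNI is easy to sketch: append freshly sampled random channels to each node's input features, so that otherwise indistinguishable nodes receive (almost surely) distinct inputs. The extra dimensionality below is an arbitrary choice of mine.

```python
import torch

def with_random_node_init(x, extra_dims=4):
    # Resampled on every forward pass, which is what breaks the
    # symmetry that limits standard message passing.
    r = torch.randn(x.size(0), extra_dims)
    return torch.cat([x, r], dim=1)

x = torch.ones(6, 3)                    # six identical-looking nodes
print(with_random_node_init(x).shape)   # torch.Size([6, 7])
```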
arXiv Detail & Related papers (2020-10-02T19:53:05Z)
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs' ability to preserve graph structures.
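A minimal sketch of the plug-in idea, assuming a dense symmetric adjacency matrix and top-eigenvector features; the exact structure matrix and normalization used by Eigen-GNN may differ.

```python
import torch

def eigen_features(adj, d=4):
    """Leading eigenvectors of the (symmetrized) adjacency matrix,
    used as structure-preserving extra node features."""
    evals, evecs = torch.linalg.eigh((adj + adj.T) / 2)
    return evecs[:, -d:]  # eigenvectors of the d largest eigenvalues

adj = (torch.rand(6, 6) > 0.5).float()
x = torch.randn(6, 3)
x_aug = torch.cat([x, eigen_features(adj, d=2)], dim=1)  # plug-in: concat
print(x_aug.shape)  # torch.Size([6, 5])
```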
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
- XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black boxes and lack human-intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
arXiv Detail & Related papers (2020-06-03T23:52:43Z)