AKE-GNN: Effective Graph Learning with Adaptive Knowledge Exchange
- URL: http://arxiv.org/abs/2106.05455v3
- Date: Wed, 4 Oct 2023 10:50:38 GMT
- Title: AKE-GNN: Effective Graph Learning with Adaptive Knowledge Exchange
- Authors: Liang Zeng, Jin Xu, Zijun Yao, Yanqiao Zhu, Jian Li
- Abstract summary: Graph Neural Networks (GNNs) have already been widely used in various graph mining tasks.
Recent works reveal that the learned weights (channels) in well-trained GNNs are highly redundant, which limits the performance of GNNs.
We introduce a novel GNN learning framework named AKE-GNN, which performs the Adaptive Knowledge Exchange strategy.
- Score: 14.919474099848816
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have already been widely used in various graph
mining tasks. However, recent works reveal that the learned weights (channels)
in well-trained GNNs are highly redundant, which inevitably limits the
performance of GNNs. Instead of removing these redundant channels for
efficiency consideration, we aim to reactivate them to enlarge the
representation capacity of GNNs for effective graph learning. In this paper, we
propose to substitute these redundant channels with other informative channels
to achieve this goal. We introduce a novel GNN learning framework named
AKE-GNN, which performs the Adaptive Knowledge Exchange strategy among multiple
graph views generated by graph augmentations. AKE-GNN first trains multiple
GNNs each corresponding to one graph view to obtain informative channels. Then,
AKE-GNN iteratively exchanges redundant channels in the weight parameter matrix
of one GNN with informative channels of another GNN in a layer-wise manner.
Additionally, existing GNNs can be seamlessly incorporated into our framework.
AKE-GNN achieves superior performance compared with various baselines across a
suite of experiments on node classification, link prediction, and graph
classification. In particular, we conduct a series of experiments on 15 public
benchmark datasets, 8 popular GNN models, and 3 graph tasks and show that
AKE-GNN consistently outperforms existing popular GNN models and even their
ensembles. Extensive ablation studies and analyses on knowledge exchange
methods validate the effectiveness of AKE-GNN.
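The core exchange step described in the abstract — replacing redundant channels in one GNN's weight matrix with informative channels from another GNN trained on a different graph view — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the redundancy criterion here (maximum cosine similarity between weight columns) and the function names are assumptions for the sake of the example; the paper's actual exchange strategy is adaptive and layer-wise.

```python
import numpy as np

def channel_redundancy(W, eps=1e-8):
    """Score each output channel (column of W) by its maximum absolute
    cosine similarity to any other channel; near-duplicate channels
    score close to 1 and are treated as redundant."""
    Wn = W / (np.linalg.norm(W, axis=0, keepdims=True) + eps)
    sim = np.abs(Wn.T @ Wn)
    np.fill_diagonal(sim, 0.0)  # ignore self-similarity
    return sim.max(axis=1)

def exchange_redundant_channels(W_a, W_b, k=1):
    """Replace the k most redundant channels of W_a with the k least
    redundant (most informative) channels of W_b, one layer at a time."""
    red_a = channel_redundancy(W_a)
    red_b = channel_redundancy(W_b)
    victims = np.argsort(red_a)[-k:]  # most redundant in A
    donors = np.argsort(red_b)[:k]    # most informative in B
    W_new = W_a.copy()
    W_new[:, victims] = W_b[:, donors]
    return W_new
```

In the full framework this exchange would be applied iteratively across layers and across the GNNs trained on different augmented graph views, rather than once between a single pair of weight matrices.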
Related papers
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z) - GNN-Ensemble: Towards Random Decision Graph Neural Networks [3.7620848582312405]
Graph Neural Networks (GNNs) have enjoyed wide spread applications in graph-structured data.
GNNs are required to learn latent patterns from a limited amount of training data to perform inferences on a vast amount of test data.
In this paper, we push one step forward on the ensemble learning of GNNs, with improved accuracy and robustness against adversarial attacks.
arXiv Detail & Related papers (2023-03-20T18:24:01Z) - Distributed Graph Neural Network Training: A Survey [51.77035975191926]
Graph neural networks (GNNs) are a type of deep learning models that are trained on graphs and have been successfully applied in various domains.
Despite the effectiveness of GNNs, it is still challenging for GNNs to efficiently scale to large graphs.
As a remedy, distributed computing has become a promising solution for training large-scale GNNs.
arXiv Detail & Related papers (2022-11-01T01:57:00Z) - KerGNNs: Interpretable Graph Neural Networks with Graph Kernels [14.421535610157093]
Graph neural networks (GNNs) have become the state-of-the-art method in downstream graph-related tasks.
We propose a novel GNN framework, termed Kernel Graph Neural Networks (KerGNNs).
KerGNNs integrate graph kernels into the message passing process of GNNs.
We show that our method achieves competitive performance compared with existing state-of-the-art methods.
arXiv Detail & Related papers (2022-01-03T06:16:30Z) - AdaGNN: A multi-modal latent representation meta-learner for GNNs based
on AdaBoosting [0.38073142980733]
Graph Neural Networks (GNNs) focus on extracting intrinsic network features.
We propose a boosting-based meta-learner for GNNs.
AdaGNN performs exceptionally well for applications with rich and diverse node neighborhood information.
arXiv Detail & Related papers (2021-08-14T03:07:26Z) - Optimization of Graph Neural Networks: Implicit Acceleration by Skip
Connections and More Depth [57.10183643449905]
Graph Neural Networks (GNNs) have been studied from the lens of expressive power and generalization.
We study the training dynamics of GNNs, focusing on the effects of skip connections and increased depth.
Our results provide the first theoretical support for the success of GNNs.
arXiv Detail & Related papers (2021-05-10T17:59:01Z) - Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message-passing Graph Neural Networks, called Identity-aware GNNs (ID-GNNs), with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z) - GPT-GNN: Generative Pre-Training of Graph Neural Networks [93.35945182085948]
Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data.
We present the GPT-GNN framework to initialize GNNs by generative pre-training.
We show that GPT-GNN significantly outperforms state-of-the-art GNN models without pre-training by up to 9.1% across various downstream tasks.
arXiv Detail & Related papers (2020-06-27T20:12:33Z) - XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black-boxes and lack human intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
arXiv Detail & Related papers (2020-06-03T23:52:43Z) - Self-Enhanced GNN: Improving Graph Neural Networks Using Model Outputs [20.197085398581397]
Graph neural networks (GNNs) have received much attention recently because of their excellent performance on graph-based tasks.
We propose self-enhanced GNN (SEG), which improves the quality of the input data using the outputs of existing GNN models.
SEG consistently improves the performance of well-known GNN models such as GCN, GAT and SGC across different datasets.
arXiv Detail & Related papers (2020-02-18T12:27:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.