FedGKD: Unleashing the Power of Collaboration in Federated Graph Neural
Networks
- URL: http://arxiv.org/abs/2309.09517v3
- Date: Thu, 21 Sep 2023 08:37:22 GMT
- Title: FedGKD: Unleashing the Power of Collaboration in Federated Graph Neural
Networks
- Authors: Qiying Pan, Ruofan Wu, Tengfei Liu, Tianyi Zhang, Yifei Zhu, Weiqiang
Wang
- Abstract summary: Federated training of Graph Neural Networks (GNN) has become popular in recent years due to its ability to perform graph-related tasks under data isolation scenarios.
However, graph heterogeneity issues in federated GNN systems continue to pose challenges.
We propose FedGKD, a novel federated GNN framework that utilizes a client-side graph dataset distillation method.
- Score: 40.5420021584431
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Federated training of Graph Neural Networks (GNN) has become popular in
recent years due to its ability to perform graph-related tasks under data
isolation scenarios while preserving data privacy. However, graph heterogeneity
issues in federated GNN systems continue to pose challenges. Existing
frameworks address the problem by representing local tasks using different
statistics and relating them through a simple aggregation mechanism. However,
these approaches suffer from limited efficiency in two respects: the low quality of
task-relatedness quantification and the inefficacy of exploiting the
collaboration structure. To address these issues, we propose FedGKD, a novel
federated GNN framework that utilizes a client-side graph dataset distillation
method to extract task features that better describe task-relatedness, and
introduces a server-side aggregation mechanism that is aware of the global
collaboration structure. We conduct extensive experiments on six real-world
datasets of different scales, demonstrating that our framework outperforms
existing approaches.
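For intuition only, the sketch below mocks up the two ingredients named in the abstract: each client summarizes its local graph with a distilled task-feature vector, and the server aggregates client models with weights derived from the similarity of those features (a stand-in for the collaboration structure). The function names, the cosine-similarity/softmax weighting, and the toy data are illustrative assumptions, not FedGKD's actual algorithm.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two task-feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def collaboration_aware_aggregate(client_params, task_features):
    """Give every client a personalized, similarity-weighted model average.

    Weights come from pairwise similarity of the clients' distilled task
    features, so clients with related tasks borrow more from each other.
    """
    n = len(client_params)
    sim = np.array([[cosine_similarity(task_features[i], task_features[j])
                     for j in range(n)] for i in range(n)])
    weights = np.exp(sim) / np.exp(sim).sum(axis=1, keepdims=True)  # row-wise softmax
    return [sum(weights[i, j] * client_params[j] for j in range(n))
            for i in range(n)]

# Toy example: 3 clients, each with a 4-dim distilled task feature and a
# 10-dim parameter vector produced by local training.
rng = np.random.default_rng(0)
features = [rng.normal(size=4) for _ in range(3)]
params = [rng.normal(size=10) for _ in range(3)]
personalized = collaboration_aware_aggregate(params, features)
print(personalized[0].shape)  # (10,)
```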
Related papers
- Unified Graph Networks (UGN): A Deep Neural Framework for Solving Graph Problems [0.5699788926464752]
We propose a novel framework named Unified Graph Network (UGN) to solve graph problems.
UGN is based on graph convolutional neural networks (GCN) and 2-dimensional convolutional neural networks (Conv2D).
arXiv Detail & Related papers (2025-02-11T12:03:18Z)
- Revisiting Graph Neural Networks on Graph-level Tasks: Comprehensive Experiments, Analysis, and Improvements [54.006506479865344]
We propose a unified evaluation framework for graph-level Graph Neural Networks (GNNs).
This framework provides a standardized setting to evaluate GNNs across diverse datasets.
We also propose a novel GNN model with enhanced expressivity and generalization capabilities.
arXiv Detail & Related papers (2025-01-01T08:48:53Z)
- Federated Continual Graph Learning [7.464095716250756]
We present a pioneering study on Federated Continual Graph Learning (FCGL).
FCGL adapts to multiple evolving graphs within decentralized settings while adhering to storage and privacy constraints.
Our work begins with a comprehensive empirical analysis of FCGL, assessing its data characteristics, feasibility, and effectiveness.
arXiv Detail & Related papers (2024-11-28T05:15:47Z)
- TANGNN: a Concise, Scalable and Effective Graph Neural Networks with Top-m Attention Mechanism for Graph Representation Learning [7.879217146851148]
We propose an innovative Graph Neural Network (GNN) architecture that integrates a Top-m attention mechanism aggregation component and a neighborhood aggregation component (a minimal Top-m attention sketch appears after this list).
To assess the effectiveness of our proposed model, we have applied it to citation sentiment prediction, a novel task previously unexplored in the GNN field.
arXiv Detail & Related papers (2024-11-23T05:31:25Z)
- Federated Temporal Graph Clustering [9.779760673367663]
Temporal graph clustering is a complex task that involves discovering meaningful structures in dynamic graphs where relationships and entities change over time.
Existing methods typically require centralized data collection, which poses significant privacy and communication challenges.
We introduce a novel Federated Temporal Graph Clustering framework that enables decentralized training of graph neural networks (GNNs) across multiple clients.
arXiv Detail & Related papers (2024-10-16T08:04:57Z)
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks (a minimal random-walk sampling sketch appears after this list).
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
- Federated Neural Graph Databases [53.03085605769093]
We propose Federated Neural Graph Database (FedNGDB), a novel framework that enables reasoning over multi-source graph-based data while preserving privacy.
Unlike existing methods, FedNGDB can handle complex graph structures and relationships, making it suitable for various downstream tasks.
arXiv Detail & Related papers (2024-02-22T14:57:44Z)
- Over-Squashing in Graph Neural Networks: A Comprehensive survey [0.0]
This survey delves into the challenge of over-squashing in Graph Neural Networks (GNNs).
It comprehensively explores the causes, consequences, and mitigation strategies for over-squashing.
Various methodologies are reviewed, including graph rewiring, novel normalization, spectral analysis, and curvature-based strategies.
The survey also discusses the interplay between over-squashing and other GNN limitations, such as over-smoothing.
arXiv Detail & Related papers (2023-08-29T18:46:15Z)
- Graph Transformer GANs for Graph-Constrained House Generation [223.739067413952]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The GTGAN learns effective graph node relations in an end-to-end fashion for the challenging graph-constrained house generation task.
arXiv Detail & Related papers (2023-03-14T20:35:45Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
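As a companion to the TANGNN entry above, the following is a minimal sketch of a Top-m attention aggregation step: each node attends only to the m neighbors whose attention scores are highest. The dot-product scoring, the value of m, and the data layout are illustrative assumptions, not TANGNN's implementation.

```python
import numpy as np

def top_m_attention_aggregate(h, neighbors, m=3):
    """Aggregate each node's features from its top-m scoring neighbors.

    h: (num_nodes, dim) node-feature matrix.
    neighbors: dict mapping node id -> list of neighbor ids.
    Scores are plain dot products here purely for illustration.
    """
    out = np.zeros_like(h)
    for v, nbrs in neighbors.items():
        if not nbrs:                        # isolated node: keep its own features
            out[v] = h[v]
            continue
        scores = np.array([h[v] @ h[u] for u in nbrs])
        keep = np.argsort(scores)[-m:]      # indices of the top-m neighbors
        weights = np.exp(scores[keep] - scores[keep].max())
        weights /= weights.sum()            # softmax over the kept scores
        out[v] = sum(w * h[nbrs[i]] for w, i in zip(weights, keep))
    return out

# Toy usage: 5 nodes with 4-dim features on a small graph.
rng = np.random.default_rng(1)
h = rng.normal(size=(5, 4))
neighbors = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0], 4: [0]}
print(top_m_attention_aggregate(h, neighbors, m=2).shape)  # (5, 4)
```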
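Likewise, for the GSPT entry, the sketch below shows one common way to sample a node's context with a random walk, producing a token-like sequence that a Transformer-style model could consume. The walk length, dead-end handling, and function name are illustrative assumptions rather than GSPT's actual procedure.

```python
import random

def random_walk_context(adj, start, walk_length=8, seed=None):
    """Sample a fixed-length random walk starting at `start`.

    adj: dict mapping node id -> list of neighbor ids.
    The visited node ids can serve as the node's "context sequence"
    for sequence-style (e.g. Transformer) pretraining.
    """
    rng = random.Random(seed)
    walk = [start]
    for _ in range(walk_length - 1):
        nbrs = adj[walk[-1]]
        walk.append(rng.choice(nbrs) if nbrs else walk[-1])  # stay put at dead ends
    return walk

# Toy usage on a 4-node path graph.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(random_walk_context(adj, start=0, walk_length=6, seed=42))
```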