FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks
- URL: http://arxiv.org/abs/2104.07145v1
- Date: Wed, 14 Apr 2021 22:11:35 GMT
- Title: FedGraphNN: A Federated Learning System and Benchmark for Graph Neural Networks
- Authors: Chaoyang He, Keshav Balasubramanian, Emir Ceyani, Yu Rong, Peilin Zhao, Junzhou Huang, Murali Annavaram, Salman Avestimehr
- Abstract summary: Graph Neural Network (GNN) research is rapidly growing thanks to the capacity of GNNs to learn representations from graph-structured data.
Centralizing a massive amount of real-world graph data for GNN training is prohibitive due to user-side privacy concerns.
We introduce FedGraphNN, an open research federated learning system and a benchmark to facilitate GNN-based FL research.
- Score: 68.64678614325193
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Network (GNN) research is rapidly growing thanks to the capacity
of GNNs to learn representations from graph-structured data. However,
centralizing a massive amount of real-world graph data for GNN training is
prohibitive due to user-side privacy concerns, regulation restrictions, and
commercial competition. Federated learning (FL), a trending distributed
learning paradigm, aims to solve this challenge while preserving privacy.
Despite recent advances in vision and language domains, there is no suitable
platform for the federated training of GNNs. To this end, we introduce
FedGraphNN, an open research federated learning system and a benchmark to
facilitate GNN-based FL research. FedGraphNN is built on a unified formulation
of federated GNNs and supports commonly used datasets, GNN models, FL
algorithms, and flexible APIs. We also contribute a new molecular dataset,
hERG, to promote research exploration. Our experimental results reveal
significant challenges in federated GNN training: federated GNNs perform worse
than centralized GNNs on most datasets with a non-I.I.D. split; the GNN model
that attains the best result in the centralized setting may not hold its
advantage in the federated setting. These results imply that more research
efforts are needed to unravel the mystery behind federated GNN training.
Moreover, our system performance analysis demonstrates that the FedGraphNN
system is computationally affordable to most research labs with limited GPUs.
We maintain the source code at https://github.com/FedML-AI/FedGraphNN.
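To make the training setup concrete, below is a minimal sketch of the FedAvg-style aggregation that underlies this kind of federated GNN training. It is illustrative only, not the FedGraphNN API: the `clients` structure and the user-supplied `loss_fn` are assumptions for the example.

```python
# Minimal FedAvg sketch for federated GNN training (illustrative; not the
# FedGraphNN API). Each client holds private local graph data and trains a
# copy of the shared GNN; only model parameters are sent to the server.
import copy
import torch

def fedavg_round(global_model, clients, loss_fn, local_steps=1, lr=0.01):
    """One communication round.

    clients:  list of (dataloader, num_samples); loaders yield local graph
              mini-batches that never leave the client (hypothetical format).
    loss_fn:  callable (model, batch) -> scalar training loss.
    """
    states, counts = [], []
    for loader, n_k in clients:
        local = copy.deepcopy(global_model)   # start from the global weights
        opt = torch.optim.SGD(local.parameters(), lr=lr)
        local.train()
        for _ in range(local_steps):
            for batch in loader:
                opt.zero_grad()
                loss_fn(local, batch).backward()
                opt.step()
        states.append(local.state_dict())
        counts.append(n_k)
    n = float(sum(counts))
    # Server aggregation: w <- sum_k (n_k / n) * w_k  (the FedAvg update)
    averaged = {key: sum((n_k / n) * s[key].float()
                         for n_k, s in zip(counts, states))
                for key in states[0]}
    global_model.load_state_dict(averaged)
    return global_model
```

Because each client's local graph distribution can differ (the non-I.I.D. splits studied above), the averaged model can drift from any single client's optimum, which is one plausible source of the accuracy gaps the paper reports.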
Related papers
- Information Flow in Graph Neural Networks: A Clinical Triage Use Case [49.86931948849343]
Graph Neural Networks (GNNs) have gained popularity in healthcare and other domains due to their ability to process multi-modal and multi-relational graphs.
We investigate how the flow of embedding information within GNNs affects the prediction of links in Knowledge Graphs (KGs).
Our results demonstrate that incorporating domain knowledge into the GNN connectivity leads to better performance than using the same connectivity as the KG or allowing unconstrained embedding propagation.
arXiv Detail & Related papers (2023-09-12T09:18:12Z)
- FedHGN: A Federated Framework for Heterogeneous Graph Neural Networks [45.94642721490744]
Heterogeneous graph neural networks (HGNNs) can learn from typed and relational graph data more effectively than conventional GNNs.
With larger parameter spaces, HGNNs may require more training data, which is often scarce in real-world applications due to privacy regulations.
We propose FedHGN, a novel and general federated graph learning (FGL) framework for HGNNs.
arXiv Detail & Related papers (2023-05-16T18:01:49Z)
- Distributed Graph Neural Network Training: A Survey [51.77035975191926]
Graph neural networks (GNNs) are a class of deep learning models trained on graphs and have been successfully applied in various domains.
Despite their effectiveness, it remains challenging to scale GNN training efficiently to large graphs.
As a remedy, distributed computing has become a promising solution for training large-scale GNNs.
arXiv Detail & Related papers (2022-11-01T01:57:00Z)
- Federated Graph Neural Networks: Overview, Techniques and Challenges [16.62839758251491]
Graph neural networks (GNNs) have received significant research attention.
As societies become increasingly concerned with data privacy, GNNs must adapt to this new normal.
This has led to the rapid development of federated graph neural networks (FedGNNs) research in recent years.
arXiv Detail & Related papers (2022-02-15T09:05:35Z)
- Wide and Deep Graph Neural Network with Distributed Online Learning [174.8221510182559]
Graph neural networks (GNNs) are naturally distributed architectures for learning representations from network data.
Online learning can be leveraged to retrain GNNs at testing time, overcoming the mismatch between the graph a GNN was trained on and the one it is deployed on.
This paper develops the Wide and Deep GNN (WD-GNN), a novel architecture that can be updated with distributed online learning mechanisms.
arXiv Detail & Related papers (2021-07-19T23:56:48Z)
- SpreadGNN: Serverless Multi-task Federated Learning for Graph Neural Networks [13.965982814292971]
Graph Neural Networks (GNNs) are first-choice methods for graph machine learning problems.
Centralizing a massive amount of real-world graph data for GNN training is prohibitive due to user-side privacy concerns.
This work proposes SpreadGNN, a novel multi-task federated training framework.
arXiv Detail & Related papers (2021-06-04T22:20:47Z)
- A Unified Lottery Ticket Hypothesis for Graph Neural Networks [82.31087406264437]
We present a unified GNN sparsification (UGS) framework that simultaneously prunes the graph adjacency matrix and the model weights.
We further generalize the popular lottery ticket hypothesis to GNNs for the first time, by defining a graph lottery ticket (GLT) as a pair of a core sub-dataset and a sparse sub-network; a minimal sketch of this joint pruning idea appears after this list.
arXiv Detail & Related papers (2021-02-12T21:52:43Z)
- Wide and Deep Graph Neural Networks with Distributed Online Learning [175.96910854433574]
Graph neural networks (GNNs) learn representations from network data with naturally distributed architectures.
Online learning can be used to retrain GNNs at testing time, overcoming the mismatch between the graph used for training and the one encountered at deployment.
This paper proposes the Wide and Deep GNN (WD-GNN), a novel architecture that can be easily updated with distributed online learning mechanisms.
arXiv Detail & Related papers (2020-06-11T12:48:03Z)
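As referenced in the unified lottery ticket entry above, here is a minimal sketch of jointly sparsifying the graph and the model in the spirit of UGS. It is an assumption-laden illustration: the paper trains differentiable masks with sparsity regularization, whereas this sketch substitutes simple one-shot magnitude pruning, and `edge_scores` / `weight_scores` are hypothetical inputs.

```python
# Illustrative joint graph/weight pruning (not the paper's exact method:
# UGS learns differentiable masks; here we use one-shot magnitude pruning).
import torch

def magnitude_prune(scores, fraction):
    """Return a 0/1 mask zeroing the lowest-magnitude `fraction` of entries."""
    k = int(fraction * scores.numel())
    if k == 0:
        return torch.ones_like(scores)
    threshold = scores.abs().flatten().kthvalue(k).values
    return (scores.abs() > threshold).float()

def ugs_step(edge_scores, weight_scores, p_graph=0.05, p_weight=0.2):
    """One sparsification step toward a 'graph lottery ticket' (GLT).

    edge_scores:   1-D tensor, one importance score per graph edge.
    weight_scores: dict name -> tensor of per-weight importance scores.
    Returns binary masks for the sub-graph and the sparse sub-network.
    """
    graph_mask = magnitude_prune(edge_scores, p_graph)      # prune adjacency
    weight_masks = {name: magnitude_prune(s, p_weight)      # prune weights
                    for name, s in weight_scores.items()}
    return graph_mask, weight_masks
```

Applying `graph_mask` to the adjacency matrix and `weight_masks` to the GNN's parameters, then retraining, yields one candidate GLT; the paper iterates this jointly rather than pruning either side in isolation.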
This list is automatically generated from the titles and abstracts of the papers on this site.