OpenFGL: A Comprehensive Benchmark for Federated Graph Learning
- URL: http://arxiv.org/abs/2408.16288v1
- Date: Thu, 29 Aug 2024 06:40:01 GMT
- Title: OpenFGL: A Comprehensive Benchmark for Federated Graph Learning
- Authors: Xunkai Li, Yinlin Zhu, Boyang Pang, Guochen Yan, Yeyu Yan, Zening Li, Zhengyu Wu, Wentao Zhang, Rong-Hua Li, Guoren Wang
- Abstract summary: Federated graph learning (FGL) has emerged as a promising distributed training paradigm for graph neural networks across multiple local systems without direct data sharing.
Despite the proliferation of FGL, the diverse motivations from practical applications, spanning various research backgrounds and experimental settings, pose a significant challenge to fair evaluation.
We propose OpenFGL, a unified benchmark designed for the primary FGL scenarios: Graph-FL and Subgraph-FL.
- Score: 36.04858706246336
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated graph learning (FGL) has emerged as a promising distributed training paradigm for graph neural networks across multiple local systems without direct data sharing. This approach is particularly beneficial in privacy-sensitive scenarios and offers a new perspective on addressing scalability challenges in large-scale graph learning. Despite the proliferation of FGL, the diverse motivations from practical applications, spanning various research backgrounds and experimental settings, pose a significant challenge to fair evaluation. To fill this gap, we propose OpenFGL, a unified benchmark designed for the primary FGL scenarios: Graph-FL and Subgraph-FL. Specifically, OpenFGL includes 38 graph datasets from 16 application domains, 8 federated data simulation strategies that emphasize graph properties, and 5 graph-based downstream tasks. Additionally, it offers 18 recently proposed state-of-the-art (SOTA) FGL algorithms through a user-friendly API, enabling a thorough comparison and comprehensive evaluation of their effectiveness, robustness, and efficiency. Empirical results demonstrate the capabilities of FGL while also revealing its potential limitations, offering valuable insights for future exploration in this thriving field.
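To make the notion of a federated data simulation strategy concrete, here is a minimal sketch that partitions graph nodes across clients with Dirichlet label skew, one common way to induce non-IID federated splits. It is only an illustration under that assumption: the function name and parameters are hypothetical and do not reproduce OpenFGL's actual API or its eight strategies.

```python
# Hypothetical sketch: label-skew (Dirichlet) partitioning of graph nodes across
# federated clients. Not OpenFGL's actual API or simulation strategies.
import numpy as np

def dirichlet_label_partition(labels, num_clients, alpha=0.5, seed=0):
    """Split node indices into num_clients shards with Dirichlet label skew."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for c in np.unique(labels):
        idx = rng.permutation(np.where(labels == c)[0])
        # Per-client proportions for this class; smaller alpha => stronger skew.
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client_id, shard in enumerate(np.split(idx, cuts)):
            client_indices[client_id].extend(shard.tolist())
    return client_indices

# Example: 10 synthetic node labels split over 3 clients.
parts = dirichlet_label_partition(labels=[0, 1, 2, 0, 1, 2, 0, 1, 2, 0],
                                  num_clients=3, alpha=0.5)
print(parts)
```

With alpha close to 0 each client is dominated by a few classes, while a large alpha approaches a uniform split.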
Related papers
- SpreadFGL: Edge-Client Collaborative Federated Graph Learning with Adaptive Neighbor Generation [16.599474223790843]
Federated Graph Learning (FGL) has garnered widespread attention by enabling collaborative training on multiple clients for classification tasks.
We propose a novel FGL framework, named SpreadFGL, to promote the information flow in edge-client collaboration.
We show that SpreadFGL achieves higher accuracy and faster convergence than state-of-the-art algorithms.
arXiv Detail & Related papers (2024-07-14T09:34:19Z) - IGL-Bench: Establishing the Comprehensive Benchmark for Imbalanced Graph Learning [47.34876616533362]
IGL-Bench is a comprehensive benchmark for imbalanced graph learning.
It investigates state-of-the-art IGL algorithms in terms of effectiveness, robustness, and efficiency on node-level and graph-level tasks.
arXiv Detail & Related papers (2024-06-14T09:30:18Z) - FedGTA: Topology-aware Averaging for Federated Graph Learning [44.11777886421429]
Federated Graph Learning (FGL) is a distributed machine learning paradigm that enables collaborative training on large-scale subgraphs.
Most FGL optimization strategies ignore graph structure, leading to unsatisfactory performance and slow convergence.
We propose Federated Graph Topology-aware Aggregation (FedGTA), a personalized optimization strategy that optimizes via topology-aware local smoothing confidence and mixed neighbor features (a generic confidence-weighted aggregation is sketched below).
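As a rough, generic illustration of confidence-weighted aggregation (not FedGTA's actual procedure, whose smoothing-confidence computation is more involved), the sketch below averages client model parameters with weights derived from client-reported confidence scores; all names are illustrative.

```python
# Hypothetical sketch of confidence-weighted federated averaging: each client
# reports a scalar confidence, and the server weights its parameters accordingly.
import torch

def weighted_average(client_states, confidences):
    """Aggregate client state_dicts with normalized confidence weights."""
    weights = torch.tensor(confidences, dtype=torch.float32)
    weights = weights / weights.sum()
    global_state = {}
    for key in client_states[0]:
        # Stack parameters along a new client dimension, then take a weighted sum.
        stacked = torch.stack([state[key].float() for state in client_states])
        shaped = weights.view(-1, *([1] * (stacked.dim() - 1)))
        global_state[key] = (shaped * stacked).sum(0)
    return global_state
```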
arXiv Detail & Related papers (2024-01-22T08:31:53Z) - Chasing Fairness in Graphs: A GNN Architecture Perspective [73.43111851492593]
We propose Fair Message Passing (FMP), designed within a unified optimization framework for graph neural networks (GNNs).
In FMP, aggregation is first applied to exploit neighbors' information, and then a bias mitigation step explicitly pushes demographic group representation centers together (see the generic sketch below).
Experiments on node classification tasks demonstrate that the proposed FMP outperforms several baselines in terms of fairness and accuracy on three real-world datasets.
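The bias-mitigation idea above can be illustrated with a generic group-center alignment penalty on node embeddings; this is not the FMP formulation itself, and the function and variable names are hypothetical.

```python
# Hypothetical sketch: penalize the distance between the mean embeddings of two
# demographic groups, i.e., "pushing group representation centers together".
import torch

def group_center_gap(embeddings, group_mask):
    """L2 distance between mean embeddings of group-0 and group-1 nodes."""
    center_a = embeddings[group_mask].mean(dim=0)
    center_b = embeddings[~group_mask].mean(dim=0)
    return torch.norm(center_a - center_b, p=2)

# Usage: add lambda_fair * group_center_gap(h, sensitive_attr.bool())
# to the node-classification loss before backpropagation.
```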
arXiv Detail & Related papers (2023-12-19T18:00:15Z) - Learning Strong Graph Neural Networks with Weak Information [64.64996100343602]
We develop a principled approach to the problem of graph learning with weak information (GLWI).
We propose D$^2$PT, a dual-channel GNN framework that performs long-range information propagation not only on the input graph with incomplete structure, but also on a global graph that encodes global semantic similarities (a generic construction of such a similarity graph is sketched below).
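As a generic illustration of a feature-similarity "global graph" (not D$^2$PT's actual construction), the sketch below connects each node to its k most similar nodes by cosine similarity of features; names are illustrative.

```python
# Hypothetical sketch: build a kNN graph from cosine similarity of node features.
import torch

def knn_similarity_graph(features, k=5):
    """Return a (2, num_nodes * k) edge index linking each node to its k nearest neighbors."""
    normed = torch.nn.functional.normalize(features, dim=1)
    sim = normed @ normed.t()
    sim.fill_diagonal_(-float("inf"))        # exclude self-loops
    neighbors = sim.topk(k, dim=1).indices   # (num_nodes, k)
    src = torch.arange(features.size(0)).repeat_interleave(k)
    return torch.stack([src, neighbors.reshape(-1)])
```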
arXiv Detail & Related papers (2023-05-29T04:51:09Z) - FederatedScope-GNN: Towards a Unified, Comprehensive and Efficient Package for Federated Graph Learning [65.48760613529033]
Federated graph learning (FGL) has not been well supported due to its unique characteristics and requirements.
We first discuss the challenges in creating an easy-to-use FGL package and accordingly present our implemented package, FederatedScope-GNN (FS-G).
We validate the effectiveness of FS-G by conducting extensive experiments, which also yield many valuable insights about FGL for the community.
arXiv Detail & Related papers (2022-04-12T06:48:06Z) - Federated Graph Learning -- A Position Paper [36.424411232612606]
Federated learning (FL) is an emerging technique that can collaboratively train a shared model while keeping the data decentralized.
We term this setting federated graph learning (FGL).
Considering how graph data are distributed among clients, we propose four types of FGL: inter-graph FL, intra-graph FL (further divided into horizontal and vertical FGL), and graph-structured FL.
arXiv Detail & Related papers (2021-05-24T05:39:24Z) - FedGL: Federated Graph Learning Framework with Global Self-Supervision [22.124339267195822]
FedGL is capable of obtaining a high-quality global graph model while protecting data privacy.
Global self-supervision enables each client's information to flow and be shared in a privacy-preserving manner.
arXiv Detail & Related papers (2021-05-07T11:27:23Z) - CogDL: A Comprehensive Library for Graph Deep Learning [55.694091294633054]
We present CogDL, a library for graph deep learning that allows researchers and practitioners to conduct experiments, compare methods, and build applications with ease and efficiency.
In CogDL, we propose a unified design for the training and evaluation of GNN models for various graph tasks, making it unique among existing graph learning libraries.
We develop efficient sparse operators for CogDL, making it one of the most efficient graph learning libraries available.
arXiv Detail & Related papers (2021-03-01T12:35:16Z) - Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL), for jointly and iteratively learning graph structure and graph embeddings.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z)