APPFLx: Providing Privacy-Preserving Cross-Silo Federated Learning as a
Service
- URL: http://arxiv.org/abs/2308.08786v1
- Date: Thu, 17 Aug 2023 05:15:47 GMT
- Title: APPFLx: Providing Privacy-Preserving Cross-Silo Federated Learning as a
Service
- Authors: Zilinghan Li, Shilan He, Pranshu Chaturvedi, Trung-Hieu Hoang, Minseok
Ryu, E. A. Huerta, Volodymyr Kindratenko, Jordan Fuhrman, Maryellen Giger,
Ryan Chard, Kibaek Kim, Ravi Madduri
- Abstract summary: Cross-silo privacy-preserving federated learning (PPFL) is a powerful tool to collaboratively train robust and generalized machine learning (ML) models without sharing sensitive local data.
APPFLx is a ready-to-use platform that provides privacy-preserving cross-silo federated learning as a service.
- Score: 1.5070429249282935
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Cross-silo privacy-preserving federated learning (PPFL) is a powerful tool to
collaboratively train robust and generalized machine learning (ML) models
without sharing sensitive (e.g., healthcare or financial) local data. To ease
and accelerate the adoption of PPFL, we introduce APPFLx, a ready-to-use
platform that provides privacy-preserving cross-silo federated learning as a
service. APPFLx employs Globus authentication to allow users to easily and
securely invite trustworthy collaborators for PPFL, implements several
synchronous and asynchronous FL algorithms, streamlines the FL experiment
launch process, and enables tracking and visualizing the life cycle of FL
experiments, allowing domain experts and ML practitioners to easily orchestrate
and evaluate cross-silo FL under one platform. APPFLx is available online at
https://appflx.link
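The abstract notes that APPFLx implements several synchronous and asynchronous FL algorithms. The core of synchronous FL is a server-side aggregation step in the style of FedAvg, where client models are averaged weighted by local dataset size. The sketch below illustrates that step only; the function name and data layout are illustrative assumptions, not APPFLx's actual API.

```python
# Illustrative sketch of synchronous FedAvg-style aggregation, the kind of
# algorithm cross-silo FL platforms such as APPFLx implement.
# All names are hypothetical; this is NOT APPFLx's real interface.
from typing import Dict, List


def fedavg(client_weights: List[Dict[str, List[float]]],
           client_sizes: List[int]) -> Dict[str, List[float]]:
    """Average per-client model weights, weighted by local dataset size."""
    total = sum(client_sizes)
    aggregated: Dict[str, List[float]] = {}
    for name in client_weights[0]:
        aggregated[name] = [
            sum(w[name][i] * n for w, n in zip(client_weights, client_sizes)) / total
            for i in range(len(client_weights[0][name]))
        ]
    return aggregated
```

For example, with two clients holding weights `[1.0]` and `[3.0]` and dataset sizes 1 and 3, the aggregated weight is `(1.0*1 + 3.0*3)/4 = 2.5`; the larger silo dominates the average, which is the intended behavior in cross-silo settings with unequal data volumes.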
Related papers
- EdgeFL: A Lightweight Decentralized Federated Learning Framework [8.934690279361286]
We introduce EdgeFL, an edge-only lightweight decentralized FL framework.
By adopting an edge-only model training and aggregation approach, EdgeFL eliminates the need for a central server.
We show that EdgeFL achieves superior performance compared to existing FL platforms/frameworks.
arXiv Detail & Related papers (2023-09-06T11:55:41Z)
- Bayesian Federated Learning: A Survey [54.40136267717288]
Federated learning (FL) demonstrates its advantages in integrating distributed infrastructure, communication, computing and learning in a privacy-preserving manner.
The robustness and capabilities of existing FL methods are challenged by limited and dynamic data and conditions.
BFL has emerged as a promising approach to address these issues.
arXiv Detail & Related papers (2023-04-26T03:41:17Z)
- Collaborating Heterogeneous Natural Language Processing Tasks via Federated Learning [55.99444047920231]
We conduct extensive experiments on six widely-used datasets covering both Natural Language Understanding (NLU) and Natural Language Generation (NLG) tasks.
The proposed ATC framework achieves significant improvements compared with various baseline methods.
arXiv Detail & Related papers (2022-12-12T09:27:50Z)
- TorchFL: A Performant Library for Bootstrapping Federated Learning Experiments [4.075095403704456]
We introduce TorchFL, a performant library for bootstrapping federated learning experiments.
Built on a bottom-up design using PyTorch and Lightning, TorchFL provides ready-to-use abstractions for models, datasets, and FL algorithms.
arXiv Detail & Related papers (2022-11-01T20:31:55Z)
- FederatedScope: A Comprehensive and Flexible Federated Learning Platform via Message Passing [63.87056362712879]
We propose a novel and comprehensive federated learning platform, named FederatedScope, which is based on a message-oriented framework.
Compared to the procedural framework, the proposed message-oriented framework is more flexible to express heterogeneous message exchange.
We conduct a series of experiments on the provided easy-to-use and comprehensive FL benchmarks to validate the correctness and efficiency of FederatedScope.
arXiv Detail & Related papers (2022-04-11T11:24:21Z)
- Mobility-Aware Cluster Federated Learning in Hierarchical Wireless Networks [81.83990083088345]
We develop a theoretical model to characterize the hierarchical federated learning (HFL) algorithm in wireless networks.
Our analysis proves that the learning performance of HFL deteriorates drastically with highly-mobile users.
To circumvent these issues, we propose a mobility-aware cluster federated learning (MACFL) algorithm.
arXiv Detail & Related papers (2021-08-20T10:46:58Z)
- FedLab: A Flexible Federated Learning Framework [16.481399535233717]
Federated learning (FL) addresses the privacy challenge by allowing multiple parties to train a shared model without violating privacy protection regulations.
To help researchers verify their ideas in FL, we designed and developed FedLab, a flexible and modular FL framework based on PyTorch.
arXiv Detail & Related papers (2021-07-24T14:34:02Z)
- EasyFL: A Low-code Federated Learning Platform For Dummies [21.984721627569783]
We propose the first low-code Federated Learning (FL) platform, EasyFL, to enable users with various levels of expertise to experiment and prototype FL applications with little coding.
With only a few lines of code, EasyFL empowers them with many out-of-the-box functionalities to accelerate experimentation and deployment.
Our implementations show that EasyFL requires only three lines of code to build a vanilla FL application, at least 10x fewer than other platforms.
arXiv Detail & Related papers (2021-05-17T04:15:55Z)
- OpenFL: An open-source framework for Federated Learning [41.03632020180591]
Federated learning (FL) is a computational paradigm that enables organizations to collaborate on machine learning (ML) projects without sharing sensitive data.
OpenFL is an open-source framework for training ML algorithms using the data-private collaborative learning paradigm of FL.
arXiv Detail & Related papers (2021-05-13T16:40:19Z)
- FedNLP: A Research Platform for Federated Learning in Natural Language Processing [55.01246123092445]
We present the FedNLP, a research platform for federated learning in NLP.
FedNLP supports various popular task formulations in NLP such as text classification, sequence tagging, question answering, seq2seq generation, and language modeling.
Preliminary experiments with FedNLP reveal that there exists a large performance gap between learning on decentralized and centralized datasets.
arXiv Detail & Related papers (2021-04-18T11:04:49Z)
- FedML: A Research Library and Benchmark for Federated Machine Learning [55.09054608875831]
Federated learning (FL) is a rapidly growing research field in machine learning.
Existing FL libraries cannot adequately support diverse algorithmic development.
We introduce FedML, an open research library and benchmark to facilitate FL algorithm development and fair performance comparison.
arXiv Detail & Related papers (2020-07-27T13:02:08Z)
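In contrast to the synchronous aggregation used by many of the platforms listed above, asynchronous FL (also mentioned in the APPFLx abstract) applies each client update as soon as it arrives, typically down-weighting stale updates. The sketch below shows one common staleness-weighted mixing rule; the function names and the specific decay rule are illustrative assumptions, not taken from any listed framework.

```python
# Hedged sketch of an asynchronous FL server update with staleness weighting.
# The decay rule alpha = base_lr / (1 + staleness) is one common choice;
# names and defaults here are illustrative, not a specific framework's API.
from typing import Dict, List


def async_update(global_weights: Dict[str, List[float]],
                 client_weights: Dict[str, List[float]],
                 staleness: int,
                 base_lr: float = 0.5) -> Dict[str, List[float]]:
    """Mix a (possibly stale) client update into the global model.

    Updates computed against an older global model (larger staleness)
    are down-weighted, so slow clients cannot drag the model backwards.
    """
    alpha = base_lr / (1 + staleness)  # polynomial staleness decay
    return {
        name: [(1 - alpha) * g + alpha * c
               for g, c in zip(global_weights[name], client_weights[name])]
        for name in global_weights
    }
```

With `base_lr=0.5`, a fresh update (`staleness=0`) is mixed in with weight 0.5, while an update one round stale is mixed in with weight 0.25, halving the influence of lagging clients.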
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.