EasyFL: A Low-code Federated Learning Platform For Dummies
- URL: http://arxiv.org/abs/2105.07603v1
- Date: Mon, 17 May 2021 04:15:55 GMT
- Title: EasyFL: A Low-code Federated Learning Platform For Dummies
- Authors: Weiming Zhuang, Xin Gan, Yonggang Wen, Shuai Zhang
- Abstract summary: We propose the first low-code Federated Learning (FL) platform, EasyFL, to enable users with various levels of expertise to experiment and prototype FL applications with little coding.
With only a few lines of code, EasyFL empowers them with many out-of-the-box functionalities to accelerate experimentation and deployment.
Our implementation shows that EasyFL requires only three lines of code to build a vanilla FL application, at least 10x fewer than other platforms.
- Score: 21.984721627569783
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Academia and industry have developed several platforms to support the popular
privacy-preserving distributed learning method -- Federated Learning (FL).
However, these platforms are complex to use and require a deep understanding of
FL, which imposes high barriers to entry for beginners, limits the productivity
of data scientists, and compromises deployment efficiency. In this paper, we
propose the first low-code FL platform, EasyFL, to enable users with various
levels of expertise to experiment and prototype FL applications with little
coding. We achieve this goal while ensuring great flexibility for customization
by unifying simple API design, modular design, and granular training flow
abstraction. With only a few lines of code, EasyFL empowers them with many
out-of-the-box functionalities to accelerate experimentation and deployment.
These practical functionalities, namely heterogeneity simulation, distributed
training optimization, comprehensive tracking, and seamless deployment, address
challenges identified in our proposed FL life cycle. Our implementation shows
that EasyFL requires only three lines of code to build a vanilla FL
application, at least 10x fewer than other platforms. In addition, our
evaluations demonstrate that EasyFL speeds up training by 1.5x. It also
improves the efficiency of experiments and deployment. We believe that EasyFL
will increase the productivity of data scientists and democratize FL to wider
audiences.
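The "vanilla FL application" the abstract refers to follows the standard federated averaging (FedAvg) workflow: clients train locally on private data and a server averages their model weights. The sketch below illustrates that workflow in plain, self-contained Python on a toy linear model; the function names and data are illustrative only and are not EasyFL's actual API.

```python
# Hypothetical sketch of one-dimensional FedAvg (not EasyFL's API):
# each client fits y ~ w * x on its private data, the server averages.

def local_update(weights, data, lr=0.1):
    """One pass of gradient descent on a client's (x, y) pairs."""
    w = weights
    for x, y in data:
        grad = 2 * (w * x - y) * x  # d/dw of the squared error (w*x - y)^2
        w -= lr * grad
    return w

def fedavg(client_weights, client_sizes):
    """Server step: average client models, weighted by dataset size."""
    total = sum(client_sizes)
    return sum(w * n for w, n in zip(client_weights, client_sizes)) / total

# Two clients whose private data both lie on the line y = 2x.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w_global = 0.0
for _ in range(50):  # communication rounds
    updates = [local_update(w_global, d) for d in clients]
    w_global = fedavg(updates, [len(d) for d in clients])
print(round(w_global, 2))  # converges toward 2.0
```

A low-code platform's job is to hide the round loop, client sampling, and communication behind a few high-level calls while leaving each piece (here, `local_update` and `fedavg`) replaceable for customization.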
Related papers
- A Survey on Efficient Federated Learning Methods for Foundation Model Training [62.473245910234304]
Federated Learning (FL) has become an established technique to facilitate privacy-preserving collaborative training across a multitude of clients.
In the wake of Foundation Models (FM), the reality is different for many deep learning applications.
We discuss the benefits and drawbacks of parameter-efficient fine-tuning (PEFT) for FL applications.
arXiv Detail & Related papers (2024-01-09T10:22:23Z)
- EdgeFL: A Lightweight Decentralized Federated Learning Framework [8.934690279361286]
We introduce EdgeFL, an edge-only lightweight decentralized FL framework.
By adopting an edge-only model training and aggregation approach, EdgeFL eliminates the need for a central server.
We show that EdgeFL achieves superior performance compared to existing FL platforms/frameworks.
arXiv Detail & Related papers (2023-09-06T11:55:41Z)
- APPFLx: Providing Privacy-Preserving Cross-Silo Federated Learning as a Service [1.5070429249282935]
Cross-silo privacy-preserving federated learning (PPFL) is a powerful tool to collaboratively train robust and generalized machine learning (ML) models without sharing sensitive local data.
APPFLx is a ready-to-use platform that provides privacy-preserving cross-silo federated learning as a service.
arXiv Detail & Related papers (2023-08-17T05:15:47Z)
- FLGo: A Fully Customizable Federated Learning Platform [23.09038374160798]
We propose a novel lightweight Federated learning platform called FLGo.
Our platform offers 40+ benchmarks, 20+ algorithms, and 2 system simulators as out-of-the-box plugins.
We also develop a range of experimental tools, including parallel acceleration, experiment tracker and parameters auto-tuning.
arXiv Detail & Related papers (2023-06-21T07:55:29Z)
- Vertical Semi-Federated Learning for Efficient Online Advertising [50.18284051956359]
Semi-VFL (Vertical Semi-Federated Learning) is proposed as a practical way to apply VFL in industry.
We build an inference-efficient single-party student model applicable to the whole sample space.
New representation distillation methods are designed to extract cross-party feature correlations for both the overlapped and non-overlapped data.
arXiv Detail & Related papers (2022-09-30T17:59:27Z)
- UniFed: All-In-One Federated Learning Platform to Unify Open-Source Frameworks [53.20176108643942]
We present UniFed, the first unified platform for standardizing open-source Federated Learning (FL) frameworks.
UniFed streamlines the end-to-end workflow for distributed experimentation and deployment, encompassing 11 popular open-source FL frameworks.
We evaluate and compare 11 popular FL frameworks from the perspectives of functionality, privacy protection, and performance.
arXiv Detail & Related papers (2022-07-21T05:03:04Z)
- pFL-Bench: A Comprehensive Benchmark for Personalized Federated Learning [42.819532536636835]
We propose the first comprehensive pFL benchmark, pFL-Bench, for rapid, reproducible, standardized and thorough pFL evaluation.
The proposed benchmark contains more than 10 datasets in diverse application domains with unified data partition and realistic heterogeneous settings.
We highlight the benefits and potential of state-of-the-art pFL methods and hope pFL-Bench enables further pFL research and broad applications.
arXiv Detail & Related papers (2022-06-08T02:51:59Z)
- FederatedScope: A Comprehensive and Flexible Federated Learning Platform via Message Passing [63.87056362712879]
We propose a novel and comprehensive federated learning platform, named FederatedScope, which is based on a message-oriented framework.
Compared to the procedural framework, the proposed message-oriented framework is more flexible to express heterogeneous message exchange.
We conduct a series of experiments on the provided easy-to-use and comprehensive FL benchmarks to validate the correctness and efficiency of FederatedScope.
arXiv Detail & Related papers (2022-04-11T11:24:21Z)
- Efficient Split-Mix Federated Learning for On-Demand and In-Situ Customization [107.72786199113183]
Federated learning (FL) provides a distributed learning framework for multiple participants to collaborate learning without sharing raw data.
In this paper, we propose a novel Split-Mix FL strategy for heterogeneous participants that, once training is done, provides in-situ customization of model sizes and robustness.
arXiv Detail & Related papers (2022-03-18T04:58:34Z)
- Flower: A Friendly Federated Learning Research Framework [18.54638343801354]
Federated Learning (FL) has emerged as a promising technique for edge devices to collaboratively learn a shared prediction model.
We present Flower -- a comprehensive FL framework that distinguishes itself from existing platforms by offering new facilities to execute large-scale FL experiments.
arXiv Detail & Related papers (2020-07-28T17:59:07Z)
- FedML: A Research Library and Benchmark for Federated Machine Learning [55.09054608875831]
Federated learning (FL) is a rapidly growing research field in machine learning.
Existing FL libraries cannot adequately support diverse algorithmic development.
We introduce FedML, an open research library and benchmark to facilitate FL algorithm development and fair performance comparison.
arXiv Detail & Related papers (2020-07-27T13:02:08Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.