UniFed: All-In-One Federated Learning Platform to Unify Open-Source
Frameworks
- URL: http://arxiv.org/abs/2207.10308v3
- Date: Sat, 30 Dec 2023 17:33:17 GMT
- Authors: Xiaoyuan Liu, Tianneng Shi, Chulin Xie, Qinbin Li, Kangping Hu, Haoyu
Kim, Xiaojun Xu, The-Anh Vu-Le, Zhen Huang, Arash Nourian, Bo Li, Dawn Song
- Abstract summary: We present UniFed, the first unified platform for standardizing open-source Federated Learning (FL) frameworks.
UniFed streamlines the end-to-end workflow for distributed experimentation and deployment, encompassing 11 popular open-source FL frameworks.
We evaluate and compare 11 popular FL frameworks from the perspectives of functionality, privacy protection, and performance.
- Score: 53.20176108643942
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated Learning (FL) has become a practical and widely adopted distributed
learning paradigm. However, the lack of a comprehensive and standardized
solution covering diverse use cases makes it challenging to use in practice. In
addition, selecting an appropriate FL framework for a specific use case can be
a daunting task. In this work, we present UniFed, the first unified platform
for standardizing existing open-source FL frameworks. The platform streamlines
the end-to-end workflow for distributed experimentation and deployment,
encompassing 11 popular open-source FL frameworks. In particular, to address
the substantial variations in workflows and data formats, UniFed introduces a
configuration-based schema-enforced task specification, offering 20 editable
fields. UniFed also provides functionalities such as distributed execution
management, logging, and data analysis.
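As a rough illustration of what a configuration-based, schema-enforced task specification can look like, the sketch below validates a task config against a small schema. The field names, framework list, and validation logic are illustrative assumptions for this sketch, not UniFed's actual 20-field schema.

```python
# Hypothetical sketch of a schema-enforced FL task specification.
# All field names and allowed values below are assumptions, not UniFed's schema.
import json

# A minimal "schema": field name -> (expected type, allowed values or None)
TASK_SCHEMA = {
    "framework": (str, {"FATE", "FedML", "Flower"}),
    "algorithm": (str, None),
    "dataset": (str, None),
    "training.epochs": (int, None),
    "training.batch_size": (int, None),
}

def validate_task(spec: dict) -> list:
    """Return a list of human-readable schema violations (empty if valid)."""
    errors = []
    for fname, (expected_type, allowed) in TASK_SCHEMA.items():
        if fname not in spec:
            errors.append(f"missing field: {fname}")
            continue
        value = spec[fname]
        if not isinstance(value, expected_type):
            errors.append(f"{fname}: expected {expected_type.__name__}")
        elif allowed is not None and value not in allowed:
            errors.append(f"{fname}: {value!r} not in {sorted(allowed)}")
    return errors

spec = json.loads("""{
    "framework": "Flower",
    "algorithm": "FedAvg",
    "dataset": "femnist",
    "training.epochs": 10,
    "training.batch_size": 32
}""")
print(validate_task(spec))  # [] -> the spec passes the schema check
```

Enforcing the schema before dispatching a task is what lets one specification drive many frameworks: each backend only has to translate a validated, uniform config into its own native format.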
With UniFed, we evaluate and compare 11 popular FL frameworks from the
perspectives of functionality, privacy protection, and performance, drawing
on developer surveys and code-level investigation. We collect 15
diverse FL scenario setups (e.g., horizontal and vertical settings) for FL
framework evaluation. This comprehensive evaluation allows us to analyze both
model and system performance, providing detailed comparisons and offering
recommendations for framework selection. UniFed simplifies the process of
selecting and utilizing the appropriate FL framework for specific use cases,
while enabling standardized distributed experimentation and deployment. Our
results and analysis based on experiments with up to 178 distributed nodes
provide valuable system design and deployment insights, aiming to empower
practitioners in their pursuit of effective FL solutions.
Related papers
- Advances in APPFL: A Comprehensive and Extensible Federated Learning Framework [1.4206132527980742]
Federated learning (FL) is a distributed machine learning paradigm enabling collaborative model training while preserving data privacy.
We present the recent advances in developing APPFL, a framework and benchmarking suite for federated learning.
We demonstrate the capabilities of APPFL through extensive experiments evaluating various aspects of FL, including communication efficiency, privacy preservation, computational performance, and resource utilization.
arXiv Detail & Related papers (2024-09-17T22:20:26Z)
- FedModule: A Modular Federated Learning Framework [5.872098693249397]
Federated learning (FL) has been widely adopted across various applications, such as healthcare, finance, and smart cities.
This paper introduces FedModule, a flexible FL experimental framework.
FedModule adheres to the "one code, all scenarios" principle and employs a modular design that breaks the FL process into individual components.
arXiv Detail & Related papers (2024-09-07T15:03:12Z)
- Can We Theoretically Quantify the Impacts of Local Updates on the Generalization Performance of Federated Learning? [50.03434441234569]
Federated Learning (FL) has gained significant popularity due to its effectiveness in training machine learning models across diverse sites without requiring direct data sharing.
While various algorithms have shown that FL with local updates is a communication-efficient distributed learning framework, the generalization performance of FL with local updates has received comparatively less attention.
arXiv Detail & Related papers (2024-09-05T19:00:18Z)
- A Survey on Efficient Federated Learning Methods for Foundation Model Training [62.473245910234304]
Federated Learning (FL) has become an established technique to facilitate privacy-preserving collaborative training across a multitude of clients.
In the wake of Foundation Models (FM), however, the reality is different for many deep learning applications: FMs are often too large for standard federated training.
We discuss the benefits and drawbacks of parameter-efficient fine-tuning (PEFT) for FL applications.
arXiv Detail & Related papers (2024-01-09T10:22:23Z)
- FL Games: A Federated Learning Framework for Distribution Shifts [71.98708418753786]
Federated learning aims to train predictive models for data that is distributed across clients, under the orchestration of a server.
We propose FL GAMES, a game-theoretic framework for federated learning that learns causal features that are invariant across clients.
arXiv Detail & Related papers (2022-10-31T22:59:03Z)
- ModularFed: Leveraging Modularity in Federated Learning Frameworks [8.139264167572213]
We propose a research-focused framework that addresses the complexity of Federated Learning (FL) implementations.
Within this architecture, protocols are blueprints that strictly define the framework's components' design.
Our protocols aim to enable modularity in FL, supporting third-party plug-and-play architecture and dynamic simulators.
arXiv Detail & Related papers (2022-10-31T10:21:19Z)
- Test-Time Robust Personalization for Federated Learning [5.553167334488855]
Federated Learning (FL) is a machine learning paradigm where many clients collaboratively learn a shared global model with decentralized training data.
Personalized FL additionally adapts the global model to different clients, achieving promising results on consistent local training and test distributions.
We propose Federated Test-time Head Ensemble plus tuning (FedTHE+), which personalizes FL models with robustness to various test-time distribution shifts.
arXiv Detail & Related papers (2022-05-22T20:08:14Z)
- FederatedScope: A Comprehensive and Flexible Federated Learning Platform via Message Passing [63.87056362712879]
We propose a novel and comprehensive federated learning platform, named FederatedScope, which is based on a message-oriented framework.
Compared to the procedural framework, the proposed message-oriented framework is more flexible to express heterogeneous message exchange.
We conduct a series of experiments on the provided easy-to-use and comprehensive FL benchmarks to validate the correctness and efficiency of FederatedScope.
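The core idea of a message-oriented FL framework is that server and clients exchange typed messages through a transport, and each participant registers handlers per message type instead of following a fixed procedural loop. A toy in-process sketch of that pattern (the message types, fields, and `MessageBus` class are assumptions for illustration, not FederatedScope's actual API):

```python
# Toy sketch of message-oriented FL exchange: participants register handlers
# per message type; the "bus" delivers messages until the queue drains.
# Message types and the MessageBus class are illustrative assumptions.
from dataclasses import dataclass
from collections import deque

@dataclass
class Message:
    msg_type: str          # e.g. "model_para", "update"
    sender: str
    receiver: str
    content: object = None

class MessageBus:
    """A minimal in-process transport with per-(receiver, type) handlers."""
    def __init__(self):
        self.queue = deque()
        self.handlers = {}   # (receiver, msg_type) -> callable

    def register(self, receiver, msg_type, handler):
        self.handlers[(receiver, msg_type)] = handler

    def send(self, msg):
        self.queue.append(msg)

    def run(self):
        while self.queue:
            msg = self.queue.popleft()
            self.handlers[(msg.receiver, msg.msg_type)](msg)

bus = MessageBus()
updates = []

# Server broadcasts a model parameter; each client replies with its update.
bus.register("client1", "model_para",
             lambda m: bus.send(Message("update", "client1", "server", m.content + 1)))
bus.register("client2", "model_para",
             lambda m: bus.send(Message("update", "client2", "server", m.content + 2)))
bus.register("server", "update", lambda m: updates.append(m.content))

for c in ("client1", "client2"):
    bus.send(Message("model_para", "server", c, 10))
bus.run()
print(updates)  # [11, 12]
```

Because behavior hangs off message types rather than a hard-coded training loop, adding a new exchange (say, a metrics report or an asynchronous straggler update) only means registering another handler, which is the flexibility the message-oriented design is claimed to provide over procedural frameworks.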
arXiv Detail & Related papers (2022-04-11T11:24:21Z)
- Efficient Split-Mix Federated Learning for On-Demand and In-Situ Customization [107.72786199113183]
Federated learning (FL) provides a distributed learning framework for multiple participants to collaborate learning without sharing raw data.
In this paper, we propose a novel Split-Mix FL strategy for heterogeneous participants that, once training is done, provides in-situ customization of model sizes and robustness.
arXiv Detail & Related papers (2022-03-18T04:58:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.