FLUTE: A Scalable, Extensible Framework for High-Performance Federated
Learning Simulations
- URL: http://arxiv.org/abs/2203.13789v1
- Date: Fri, 25 Mar 2022 17:15:33 GMT
- Title: FLUTE: A Scalable, Extensible Framework for High-Performance Federated
Learning Simulations
- Authors: Dimitrios Dimitriadis, Mirian Hipolito Garcia, Daniel Madrigal Diaz,
Andre Manoel, Robert Sim
- Abstract summary: "Federated Learning Utilities and Tools for Experimentation" (FLUTE) is a high-performance open source platform for federated learning research and offline simulations.
We describe the architecture of FLUTE, enabling arbitrary federated modeling schemes to be realized.
We demonstrate the effectiveness of the platform with a series of experiments for text prediction and speech recognition.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we introduce "Federated Learning Utilities and Tools for
Experimentation" (FLUTE), a high-performance open source platform for federated
learning research and offline simulations. The goal of FLUTE is to enable rapid
prototyping and simulation of new federated learning algorithms at scale,
including novel optimization, privacy, and communications strategies. We
describe the architecture of FLUTE, enabling arbitrary federated modeling
schemes to be realized, we compare the platform with other state-of-the-art
platforms, and we describe available features of FLUTE for experimentation in
core areas of active research, such as optimization, privacy and scalability.
We demonstrate the effectiveness of the platform with a series of experiments
for text prediction and speech recognition, including the addition of
differential privacy, quantization, scaling and a variety of optimization and
federation approaches.
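
The abstract describes simulating federated optimization schemes offline. As a rough illustration of the federated-averaging pattern such simulations implement (a minimal sketch under assumed names — `local_step` and `fed_avg` are hypothetical and this is not FLUTE's actual API):

```python
# Minimal offline FedAvg simulation sketch (illustrative only; not FLUTE's API).
import numpy as np

def local_step(w, X, y, lr=0.1, epochs=5):
    """Run local gradient-descent epochs on one client's linear-regression data."""
    w = w.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w

def fed_avg(client_weights, client_sizes):
    """Aggregate client models, weighted by local dataset size."""
    total = sum(client_sizes)
    return sum(n / total * w for w, n in zip(client_weights, client_sizes))

rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])

# Three simulated clients holding different amounts of local data.
clients = []
for n in (20, 50, 30):
    X = rng.normal(size=(n, 2))
    y = X @ w_true + 0.01 * rng.normal(size=n)
    clients.append((X, y))

w_global = np.zeros(2)
for _ in range(10):  # federation rounds
    updates = [local_step(w_global, X, y) for X, y in clients]
    w_global = fed_avg(updates, [len(y) for _, y in clients])

print(w_global)  # should approach w_true
```

Real platforms layer orchestration, communication, privacy, and quantization on top of this loop; the sketch only shows the core local-train/aggregate cycle.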
Related papers
- Advances in APPFL: A Comprehensive and Extensible Federated Learning Framework [1.4206132527980742]
Federated learning (FL) is a distributed machine learning paradigm enabling collaborative model training while preserving data privacy.
We present the recent advances in developing APPFL, a framework and benchmarking suite for federated learning.
We demonstrate the capabilities of APPFL through extensive experiments evaluating various aspects of FL, including communication efficiency, privacy preservation, computational performance, and resource utilization.
arXiv Detail & Related papers (2024-09-17T22:20:26Z)
- Federated Multi-View Synthesizing for Metaverse [52.59476179535153]
The metaverse is expected to provide immersive entertainment, education, and business applications.
Virtual reality (VR) transmission over wireless networks is data- and computation-intensive.
We have developed a novel multi-view synthesizing framework that can efficiently provide synthesizing, storage, and communication resources for wireless content delivery in the metaverse.
arXiv Detail & Related papers (2023-12-18T13:51:56Z)
- UNIDEAL: Curriculum Knowledge Distillation Federated Learning [17.817181326740698]
Federated Learning (FL) has emerged as a promising approach to enable collaborative learning among multiple clients.
In this paper, we present UNIDEAL, a novel FL algorithm specifically designed to tackle the challenges of cross-domain scenarios.
Our results demonstrate that UNI achieves superior performance in terms of both model accuracy and communication efficiency.
arXiv Detail & Related papers (2023-09-16T11:30:29Z)
- Adaptive Feature Fusion: Enhancing Generalization in Deep Learning Models [0.0]
This paper introduces an innovative approach, Adaptive Feature Fusion (AFF), to enhance the generalization of deep learning models.
AFF is able to adaptively fuse features based on the underlying data characteristics and model requirements.
The analysis showcases the effectiveness of AFF in enhancing generalization capabilities, leading to improved performance across different tasks and applications.
arXiv Detail & Related papers (2023-04-04T21:41:38Z)
- FS-Real: Towards Real-World Cross-Device Federated Learning [60.91678132132229]
Federated Learning (FL) aims to train high-quality models in collaboration with distributed clients while not uploading their local data.
There is still a considerable gap between flourishing FL research and real-world scenarios, mainly caused by the heterogeneity of devices and their scale.
We propose an efficient and scalable prototyping system for real-world cross-device FL, FS-Real.
arXiv Detail & Related papers (2023-03-23T15:37:17Z)
- FederatedScope: A Comprehensive and Flexible Federated Learning Platform via Message Passing [63.87056362712879]
We propose a novel and comprehensive federated learning platform, named FederatedScope, which is based on a message-oriented framework.
Compared to the procedural framework, the proposed message-oriented framework is more flexible to express heterogeneous message exchange.
We conduct a series of experiments on the provided easy-to-use and comprehensive FL benchmarks to validate the correctness and efficiency of FederatedScope.
arXiv Detail & Related papers (2022-04-11T11:24:21Z)
- Efficient Split-Mix Federated Learning for On-Demand and In-Situ Customization [107.72786199113183]
Federated learning (FL) provides a distributed learning framework for multiple participants to collaborate learning without sharing raw data.
In this paper, we propose a novel Split-Mix FL strategy for heterogeneous participants that, once training is done, provides in-situ customization of model sizes and robustness.
arXiv Detail & Related papers (2022-03-18T04:58:34Z)
- A Field Guide to Federated Optimization [161.3779046812383]
Federated learning and analytics are a distributed approach for collaboratively learning models (or statistics) from decentralized data.
This paper provides recommendations and guidelines on formulating, designing, evaluating and analyzing federated optimization algorithms.
arXiv Detail & Related papers (2021-07-14T18:09:08Z)
- FedScale: Benchmarking Model and System Performance of Federated Learning [4.1617240682257925]
FedScale is a set of challenging and realistic benchmark datasets for federated learning (FL) research.
FedScale is open-source with permissive licenses and actively maintained.
arXiv Detail & Related papers (2021-05-24T15:55:27Z)
- Edge-assisted Democratized Learning Towards Federated Analytics [67.44078999945722]
We show the hierarchical learning structure of the proposed edge-assisted democratized learning mechanism, namely Edge-DemLearn.
We also validate Edge-DemLearn as a flexible model training mechanism to build a distributed control and aggregation methodology in regions.
arXiv Detail & Related papers (2020-12-01T11:46:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.