On-device Federated Learning with Flower
- URL: http://arxiv.org/abs/2104.03042v1
- Date: Wed, 7 Apr 2021 10:42:14 GMT
- Title: On-device Federated Learning with Flower
- Authors: Akhil Mathur, Daniel J. Beutel, Pedro Porto Buarque de Gusm\~ao,
Javier Fernandez-Marques, Taner Topal, Xinchi Qiu, Titouan Parcollet, Yan
Gao, Nicholas D. Lane
- Abstract summary: Federated Learning (FL) allows edge devices to collaboratively learn a shared prediction model while keeping their training data on the device.
Despite the algorithmic advancements in FL, the support for on-device training of FL algorithms on edge devices remains poor.
We present an exploration of on-device FL on various smartphones and embedded devices using the Flower framework.
- Score: 22.719117235237036
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) allows edge devices to collaboratively learn a shared
prediction model while keeping their training data on the device, thereby
decoupling the ability to do machine learning from the need to store data in
the cloud. Despite the algorithmic advancements in FL, the support for
on-device training of FL algorithms on edge devices remains poor. In this
paper, we present an exploration of on-device FL on various smartphones and
embedded devices using the Flower framework. We also evaluate the system costs
of on-device FL and discuss how this quantification could be used to design
more efficient FL algorithms.
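For concreteness, the following is a minimal sketch of how an on-device training client could be written against the Flower (flwr) Python API. The helpers train_one_epoch and evaluate_model are hypothetical placeholders for standard PyTorch training/evaluation loops, and exact method signatures and server startup options differ between Flower releases.

```python
# Minimal, illustrative Flower client for on-device federated training.
# `train_one_epoch` and `evaluate_model` are hypothetical helpers; exact
# flwr API details (method signatures, server startup) vary by version.
from collections import OrderedDict

import flwr as fl
import torch


class OnDeviceClient(fl.client.NumPyClient):
    def __init__(self, model, train_loader, test_loader):
        self.model = model
        self.train_loader = train_loader
        self.test_loader = test_loader

    def get_parameters(self, config):
        # Serialize local weights as a list of NumPy arrays.
        return [v.cpu().numpy() for v in self.model.state_dict().values()]

    def set_parameters(self, parameters):
        keys = self.model.state_dict().keys()
        state = OrderedDict((k, torch.tensor(v)) for k, v in zip(keys, parameters))
        self.model.load_state_dict(state, strict=True)

    def fit(self, parameters, config):
        # Load global weights, train locally, and return the updated weights.
        self.set_parameters(parameters)
        train_one_epoch(self.model, self.train_loader)            # hypothetical
        return self.get_parameters(config), len(self.train_loader.dataset), {}

    def evaluate(self, parameters, config):
        self.set_parameters(parameters)
        loss, acc = evaluate_model(self.model, self.test_loader)  # hypothetical
        return float(loss), len(self.test_loader.dataset), {"accuracy": acc}


# On each device, connect to a running Flower server (address is illustrative):
# fl.client.start_numpy_client(server_address="192.168.1.10:8080",
#                              client=OnDeviceClient(model, train_dl, test_dl))
```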
Related papers
- Federated Learning for 6G: Paradigms, Taxonomy, Recent Advances and Insights [52.024964564408]
This paper examines the added-value of implementing Federated Learning throughout all levels of the protocol stack.
It presents important FL applications, addresses hot topics, provides valuable insights, and gives explicit guidance for future research and development.
Our concluding remarks aim to leverage the synergy between FL and future 6G, while highlighting FL's potential to revolutionize the wireless industry.
arXiv Detail & Related papers (2023-12-07T20:39:57Z)
- Ed-Fed: A generic federated learning framework with resource-aware client selection for edge devices [0.6875312133832078]
Federated learning (FL) has evolved as a prominent method for edge devices to cooperatively create a unified prediction model.
Although numerous research frameworks exist for simulating FL algorithms, they do not facilitate comprehensive deployment for automatic speech recognition tasks.
This is where Ed-Fed, a comprehensive and generic FL framework, comes in as a foundation for future practical FL system research.
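As a rough illustration of what resource-aware client selection can mean in practice (a generic heuristic sketch, not Ed-Fed's actual interface), a server might rank reporting devices by a simple resource score and select the top-k:

```python
# Generic resource-aware client selection sketch (not Ed-Fed's API).
from dataclasses import dataclass


@dataclass
class DeviceInfo:
    client_id: str
    free_mem_mb: float    # reported free memory
    battery_pct: float    # remaining battery percentage
    flops_per_s: float    # rough compute estimate


def select_clients(devices, k):
    """Pick the k devices with the highest (illustrative) resource score."""
    def score(d):
        # Weights are arbitrary assumptions for illustration.
        return (0.4 * d.free_mem_mb / 1024
                + 0.3 * d.battery_pct / 100
                + 0.3 * d.flops_per_s / 1e9)
    return sorted(devices, key=score, reverse=True)[:k]
```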
arXiv Detail & Related papers (2023-07-14T07:19:20Z)
- Automated Federated Learning in Mobile Edge Networks -- Fast Adaptation and Convergence [83.58839320635956]
Federated Learning (FL) can be used in mobile edge networks to train machine learning models in a distributed manner.
Recently, FL has been interpreted within a Model-Agnostic Meta-Learning (MAML) framework, which brings FL significant advantages in fast adaptation and convergence over heterogeneous datasets.
This paper addresses how much benefit MAML brings to FL and how to maximize such benefit over mobile edge networks.
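To make the MAML interpretation concrete, a simplified first-order MAML-style local update for a single client could look as follows. This is an illustrative PyTorch sketch under stated assumptions (arbitrary inner/outer learning rates, one support and one query batch), not the paper's exact algorithm.

```python
# First-order MAML-style local step for one FL client (illustrative sketch).
import copy

import torch


def maml_local_update(model, support_batch, query_batch, loss_fn,
                      inner_lr=0.01, outer_lr=0.001):
    # Inner step: adapt a copy of the global model on the support batch.
    adapted = copy.deepcopy(model)
    adapted_params = list(adapted.parameters())
    x_s, y_s = support_batch
    inner_loss = loss_fn(adapted(x_s), y_s)
    grads = torch.autograd.grad(inner_loss, adapted_params)
    with torch.no_grad():
        for p, g in zip(adapted_params, grads):
            p -= inner_lr * g

    # Outer step (first-order approximation): evaluate the adapted model on
    # the query batch and apply that gradient to the original parameters.
    x_q, y_q = query_batch
    outer_loss = loss_fn(adapted(x_q), y_q)
    meta_grads = torch.autograd.grad(outer_loss, adapted_params)
    with torch.no_grad():
        for p, g in zip(model.parameters(), meta_grads):
            p -= outer_lr * g
    return model
```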
arXiv Detail & Related papers (2023-03-23T02:42:10Z)
- Online Data Selection for Federated Learning with Limited Storage [53.46789303416799]
Federated Learning (FL) has been proposed to achieve distributed machine learning among networked devices.
The impact of on-device storage on the performance of FL remains unexplored.
In this work, we take the first step to consider the online data selection for FL with limited on-device storage.
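A simple baseline for filling a limited on-device buffer from a data stream is reservoir sampling, which keeps a uniform random subset of everything seen so far. The sketch below is generic and is not the selection criterion proposed in the paper.

```python
# Reservoir sampling buffer: a generic baseline for limited on-device storage.
import random


class ReservoirBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.samples = []
        self.seen = 0

    def add(self, sample):
        self.seen += 1
        if len(self.samples) < self.capacity:
            self.samples.append(sample)
        else:
            # Keep the new sample with probability capacity / seen.
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.samples[j] = sample
```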
arXiv Detail & Related papers (2022-09-01T03:27:33Z)
- APPFL: Open-Source Software Framework for Privacy-Preserving Federated Learning [0.0]
Federated learning (FL) enables models to be trained at different sites, with only the resulting weight updates being shared, instead of transferring the data to a central location and training there as in classical machine learning.
We introduce APPFL, the Argonne Privacy-Preserving Federated Learning framework.
APPFL allows users to leverage implemented privacy-preserving algorithms, implement new algorithms, and simulate and deploy various FL algorithms with privacy-preserving techniques.
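One common privacy-preserving building block in this setting is clipping a client update and adding Gaussian noise before it leaves the device. The sketch below is a generic illustration of that mechanism, not APPFL's actual API, and the clip norm and noise multiplier are arbitrary example values.

```python
# Generic clip-and-noise mechanism for a client update (not APPFL's API).
import numpy as np


def privatize_update(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip a list of weight arrays to `clip_norm` and add Gaussian noise."""
    rng = rng or np.random.default_rng()
    flat = np.concatenate([w.ravel() for w in update])
    scale = min(1.0, clip_norm / (np.linalg.norm(flat) + 1e-12))
    noisy = flat * scale + rng.normal(0.0, noise_multiplier * clip_norm,
                                      size=flat.shape)
    # Restore the original per-tensor shapes.
    out, offset = [], 0
    for w in update:
        out.append(noisy[offset:offset + w.size].reshape(w.shape))
        offset += w.size
    return out
```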
arXiv Detail & Related papers (2022-02-08T06:23:05Z)
- FL_PyTorch: optimization research simulator for federated learning [1.6114012813668934]
Federated Learning (FL) has emerged as a promising technique for edge devices to collaboratively learn a shared machine learning model.
FL_PyTorch is a suite of open-source software, written in Python, that builds on top of one of the most popular research deep learning (DL) frameworks, PyTorch.
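At the core of most FL optimization simulators is FedAvg-style aggregation. The following is a minimal PyTorch sketch of weighted state-dict averaging, shown only as an illustration of the idea rather than FL_PyTorch's actual code.

```python
# Illustrative FedAvg aggregation of client state_dicts (not FL_PyTorch code).
import torch


def fedavg(client_states, client_sizes):
    """Average client state_dicts weighted by local dataset size."""
    total = float(sum(client_sizes))
    averaged = {}
    for key in client_states[0]:
        averaged[key] = sum(
            state[key].float() * (n / total)
            for state, n in zip(client_states, client_sizes)
        )
    return averaged


# Typical simulated round (sketch): train copies of the global model locally,
# then load the weighted average back into the global model:
# global_model.load_state_dict(fedavg(local_states, local_sizes))
```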
arXiv Detail & Related papers (2022-02-07T12:18:28Z)
- To Talk or to Work: Flexible Communication Compression for Energy Efficient Federated Learning over Heterogeneous Mobile Edge Devices [78.38046945665538]
Federated learning (FL) over massive mobile edge devices opens new horizons for numerous intelligent mobile applications.
However, FL imposes huge communication and computation burdens on participating devices due to periodic global synchronization and continuous local training.
We develop a convergence-guaranteed FL algorithm enabling flexible communication compression.
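A common compression primitive in this line of work is top-k sparsification of the client update, where only the k largest-magnitude entries (and their indices) are transmitted. The sketch below is a generic illustration, not the paper's specific, convergence-guaranteed scheme.

```python
# Generic top-k sparsification of a client update (not the paper's algorithm).
import numpy as np


def topk_sparsify(update, k):
    """Return (indices, values) of the k largest-magnitude entries."""
    flat = update.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]


def densify(indices, values, shape):
    """Reconstruct a dense array from the transmitted sparse update."""
    out = np.zeros(int(np.prod(shape)), dtype=values.dtype)
    out[indices] = values
    return out.reshape(shape)
```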
arXiv Detail & Related papers (2020-12-22T02:54:18Z)
- Flower: A Friendly Federated Learning Research Framework [18.54638343801354]
Federated Learning (FL) has emerged as a promising technique for edge devices to collaboratively learn a shared prediction model.
We present Flower -- a comprehensive FL framework that distinguishes itself from existing platforms by offering new facilities to execute large-scale FL experiments.
arXiv Detail & Related papers (2020-07-28T17:59:07Z)
- FedML: A Research Library and Benchmark for Federated Machine Learning [55.09054608875831]
Federated learning (FL) is a rapidly growing research field in machine learning.
Existing FL libraries cannot adequately support diverse algorithmic development.
We introduce FedML, an open research library and benchmark to facilitate FL algorithm development and fair performance comparison.
arXiv Detail & Related papers (2020-07-27T13:02:08Z)
- Wireless Communications for Collaborative Federated Learning [160.82696473996566]
Internet of Things (IoT) devices may not be able to transmit their collected data to a central controller for training machine learning models.
Google's seminal FL algorithm requires all devices to be directly connected with a central controller.
This paper introduces a novel FL framework, called collaborative FL (CFL), which enables edge devices to implement FL with less reliance on a central controller.
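One way reduced reliance on a central controller can be realized is gossip-style averaging, in which each device mixes its model with those of directly reachable neighbours. The sketch below is a generic decentralized-averaging illustration, not the paper's exact CFL algorithm.

```python
# Generic neighbour (gossip-style) averaging step (not the paper's CFL method).
import numpy as np


def neighbour_average(own_weights, neighbour_weights_list):
    """Uniformly average a device's weight arrays with its neighbours'."""
    all_models = [own_weights] + list(neighbour_weights_list)
    return [np.mean([model[i] for model in all_models], axis=0)
            for i in range(len(own_weights))]
```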
arXiv Detail & Related papers (2020-06-03T20:00:02Z)