Serverless Federated Learning with flwr-serverless
- URL: http://arxiv.org/abs/2310.15329v1
- Date: Mon, 23 Oct 2023 19:49:59 GMT
- Title: Serverless Federated Learning with flwr-serverless
- Authors: Sanjeev V. Namjoshi, Reese Green, Krishi Sharma, Zhangzhang Si
- Abstract summary: We introduce flwr-serverless, a wrapper around the Flower Python package that allows for both synchronous and asynchronous federated learning.
Our approach allows federated learning to run without a central server, which broadens its range of applications and makes it more accessible.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning is becoming increasingly relevant and popular as we
witness a surge in data collection and storage of personally identifiable
information. Alongside these developments there have been many proposals from
governments around the world to provide more protections for individuals' data
and a heightened interest in data privacy measures. As deep learning continues
to become more relevant in new and existing domains, it is vital to develop
strategies like federated learning that can effectively train models on data
from different sources, such as edge devices, without compromising security and
privacy. Recently, the Flower (\texttt{Flwr}) Python package was introduced to
provide a scalable, flexible, and easy-to-use framework for implementing
federated learning. However, to date, Flower supports only synchronous
federated learning, which can be costly and time-consuming because the
process is bottlenecked by client-side training jobs that are slow or fragile.
Here, we introduce \texttt{flwr-serverless}, a wrapper around the Flower
package that extends its functionality to allow for both synchronous and
asynchronous federated learning with minimal modification to Flower's design
paradigm. Furthermore, our approach to federated learning allows the process to
run without a central server, which broadens its range of applications and
makes it more accessible. This paper presents the design details and usage of
this approach through a series of experiments that were conducted using public
datasets. Overall, we believe that our approach decreases the time and cost to
run federated training and provides an easier way to implement and experiment
with federated learning systems.
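To make the serverless design concrete, here is a minimal sketch of asynchronous federated averaging through shared storage, in the spirit of what the abstract describes: each node publishes its weights to a shared folder and averages whatever its peers have posted so far, with no coordinating server. The SharedFolder class, async_round, and every other name below are hypothetical illustrations, not the actual flwr-serverless API.

```python
# Conceptual sketch of serverless asynchronous federated averaging.
# NOT the flwr-serverless API: SharedFolder and all names below are
# hypothetical, illustrating peers exchanging weights through shared
# storage instead of a central server.
import os
import pickle
import numpy as np

class SharedFolder:
    """Hypothetical shared store (could be backed by S3 or a network mount)."""
    def __init__(self, path):
        self.path = path
        os.makedirs(path, exist_ok=True)

    def push(self, node_id, weights):
        # Publish this node's latest weights under its own key.
        with open(os.path.join(self.path, f"{node_id}.pkl"), "wb") as f:
            pickle.dump(weights, f)

    def pull_all(self):
        # Collect the latest weights posted by every node, self included.
        weights = []
        for name in os.listdir(self.path):
            with open(os.path.join(self.path, name), "rb") as f:
                weights.append(pickle.load(f))
        return weights

def async_round(node_id, local_weights, folder):
    """Publish local weights, then average whatever peers have posted so far.
    No coordinator: each node aggregates independently and asynchronously."""
    folder.push(node_id, local_weights)
    peer_weights = folder.pull_all()
    return [np.mean(layers, axis=0) for layers in zip(*peer_weights)]

# Usage: each node runs this after every local training pass.
folder = SharedFolder("/tmp/fl_shared")
new_weights = async_round("node-0", [np.zeros((4, 4)), np.zeros(4)], folder)
```

Because each node aggregates on its own schedule, a slow or failed peer never blocks the others, which is the property that removes the synchronous bottleneck described above.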
Related papers
- Blockchain-enabled Trustworthy Federated Unlearning [50.01101423318312]
Federated unlearning is a promising paradigm for protecting the data ownership of distributed clients.
Existing works require central servers to retain the historical model parameters from distributed clients.
This paper proposes a new blockchain-enabled trustworthy federated unlearning framework.
arXiv Detail & Related papers (2024-01-29T07:04:48Z) - Over-the-Air Federated Learning In Broadband Communication [0.0]
Federated learning (FL) is a privacy-preserving distributed machine learning paradigm that operates at the wireless edge.
Some existing privacy-preserving approaches rely on secure multiparty computation, which can be vulnerable to inference attacks.
Others employ differential privacy, but this may lead to decreased test accuracy when dealing with a large number of parties contributing small amounts of data.
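As a hedged illustration of that differential-privacy trade-off (a generic Gaussian mechanism, not this paper's method): each client's update is clipped to bound its sensitivity and perturbed with calibrated noise; clip_norm and noise_multiplier below are illustrative names.

```python
# Generic Gaussian mechanism for differentially private federated
# updates: clip the update to bound sensitivity, then add noise.
# clip_norm and noise_multiplier are illustrative, not from the paper.
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

# With many parties each holding little data, the per-party signal
# shrinks while the noise floor stays fixed, which is one way the
# accuracy loss noted above can arise.
```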
arXiv Detail & Related papers (2023-06-03T00:16:27Z) - Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on device).
In FL, each data holder trains a model locally and releases it to a central server for aggregation.
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and back propagation).
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
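The server-side aggregation step described above is commonly FedAvg-style weighted averaging; the sketch below shows that generic step only and is not the contrastive-distillation method this paper proposes.

```python
# Minimal sketch of server-side aggregation in classic FL
# (FedAvg-style weighted averaging); a generic illustration.
import numpy as np

def fedavg(client_weights, client_num_examples):
    """Average client models layer by layer, weighting each client
    by the number of examples it trained on."""
    total = sum(client_num_examples)
    avg = []
    for layers in zip(*client_weights):
        stacked = np.stack([w * n for w, n in zip(layers, client_num_examples)])
        avg.append(stacked.sum(axis=0) / total)
    return avg

# Two toy clients with one weight matrix each:
w = fedavg([[np.ones((2, 2))], [np.zeros((2, 2))]], [30, 10])
# -> matrix of 0.75s, since the first client holds 75% of the data
```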
arXiv Detail & Related papers (2022-11-20T10:49:22Z) - Towards Privacy-Aware Causal Structure Learning in Federated Setting [27.5652887311069]
We study a privacy-aware causal structure learning problem in the federated setting.
We propose a novel Federated PC (FedPC) algorithm with two new strategies for preserving data privacy without centralizing data.
arXiv Detail & Related papers (2022-11-13T14:54:42Z) - Practical Vertical Federated Learning with Unsupervised Representation
Learning [47.77625754666018]
Federated learning enables multiple parties to collaboratively train a machine learning model without sharing their raw data.
We propose a novel communication-efficient vertical federated learning algorithm named FedOnce, which requires only one-shot communication among parties.
Our privacy-preserving technique significantly outperforms the state-of-the-art approaches under the same privacy budget.
arXiv Detail & Related papers (2022-08-13T08:41:32Z) - Concept drift detection and adaptation for federated and continual
learning [55.41644538483948]
Smart devices can collect vast amounts of data from their environment.
This data is suitable for training machine learning models, which can significantly improve their behavior.
In this work, we present a new method, called Concept-Drift-Aware Federated Averaging.
arXiv Detail & Related papers (2021-05-27T17:01:58Z) - FedNLP: A Research Platform for Federated Learning in Natural Language
Processing [55.01246123092445]
We present FedNLP, a research platform for federated learning in NLP.
FedNLP supports various popular task formulations in NLP such as text classification, sequence tagging, question answering, seq2seq generation, and language modeling.
Preliminary experiments with FedNLP reveal that there exists a large performance gap between learning on decentralized and centralized datasets.
arXiv Detail & Related papers (2021-04-18T11:04:49Z) - Adaptive Federated Dropout: Improving Communication Efficiency and
Generalization for Federated Learning [6.982736900950362]
A revolutionary decentralized machine learning setting, known as Federated Learning, enables multiple clients located at different geographical locations to collaboratively learn a machine learning model.
Communication between the clients and the server is considered a main bottleneck in the convergence time of federated learning.
We propose and study Adaptive Federated Dropout (AFD), a novel technique to reduce the communication costs associated with federated learning.
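As a rough sketch of the communication-saving idea (AFD adaptively selects which units to keep; the fixed-rate random mask below is a deliberate simplification with hypothetical names):

```python
# Simplified federated-dropout idea: transmit only a random subset of
# model parameters to cut communication. AFD adapts which units are
# kept; this fixed-rate mask is an illustration, not AFD itself.
import numpy as np

def sparsify_update(update, keep_fraction=0.5, rng=None):
    """Zero out a random subset of entries; only the kept indices and
    values would actually be sent over the wire."""
    rng = rng or np.random.default_rng()
    mask = rng.random(update.shape) < keep_fraction
    kept = update * mask
    return kept, mask  # mask tells the receiver which entries were kept

update = np.arange(6.0).reshape(2, 3)
kept, mask = sparsify_update(update, keep_fraction=0.5)
```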
arXiv Detail & Related papers (2020-11-08T18:41:44Z) - IBM Federated Learning: an Enterprise Framework White Paper V0.1 [28.21579297214125]
Federated Learning (FL) is an approach to conduct machine learning without centralizing training data in a single place.
The framework applies to both deep neural networks and "traditional" approaches for the most common machine learning libraries.
arXiv Detail & Related papers (2020-07-22T05:32:00Z) - Federated and continual learning for classification tasks in a society
of devices [59.45414406974091]
Light Federated and Continual Consensus (LFedCon2) is a new federated and continual architecture that uses light, traditional learners.
Our method allows low-power devices (such as smartphones or robots) to learn in real time, locally, continuously, autonomously and from users.
In order to test our proposal, we have applied it in a heterogeneous community of smartphone users to solve the problem of walking recognition.
arXiv Detail & Related papers (2020-06-12T12:37:03Z)