TiFL: A Tier-based Federated Learning System
- URL: http://arxiv.org/abs/2001.09249v1
- Date: Sat, 25 Jan 2020 01:40:42 GMT
- Title: TiFL: A Tier-based Federated Learning System
- Authors: Zheng Chai, Ahsan Ali, Syed Zawad, Stacey Truex, Ali Anwar, Nathalie
Baracaldo, Yi Zhou, Heiko Ludwig, Feng Yan, Yue Cheng
- Abstract summary: Federated Learning (FL) enables learning a shared model across many clients without violating privacy requirements.
We conduct a case study to show that heterogeneity in resources and data has a significant impact on training time and model accuracy in conventional FL systems.
We propose TiFL, a Tier-based Federated Learning System, which divides clients into tiers based on their training performance and selects clients from the same tier in each training round.
- Score: 17.74678728280232
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated Learning (FL) enables learning a shared model across many clients
without violating privacy requirements. A key attribute of FL is the heterogeneity
that exists in both resources and data, arising from differences in computation and
communication capacity as well as in the quantity and content of data across
clients. We conduct a case study showing that heterogeneity in resources and data
has a significant impact on training time and model accuracy in conventional FL
systems. To address this, we propose TiFL, a Tier-based Federated Learning System,
which divides clients into tiers based on their training performance and selects
clients from the same tier in each training round, mitigating the straggler problem
caused by heterogeneity in resources and data quantity. To further tame the
heterogeneity caused by non-IID (not independent and identically distributed) data
and resources, TiFL employs an adaptive tier selection approach that updates the
tiering on the fly based on the observed training performance and accuracy over
time. We prototype TiFL in an FL testbed following Google's FL architecture and
evaluate it using popular benchmarks and the state-of-the-art FL benchmark LEAF.
Experimental evaluation shows that TiFL outperforms conventional FL under various
heterogeneous conditions. With the proposed adaptive tier selection policy, TiFL
achieves much faster training while maintaining the same, and in some cases better,
test accuracy across the board.
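To make the mechanism concrete, the following Python sketch illustrates the two ideas from the abstract: latency-based tiering and accuracy-aware adaptive tier selection. It is a minimal illustration, not the authors' implementation; the function names, the uniform tier sizes, and the moving-average accuracy estimate are all assumptions.

import random

def assign_tiers(latency, num_tiers):
    # Rank clients by profiled per-round training latency and cut the
    # ranking into num_tiers groups; tier 0 holds the fastest clients.
    ranked = sorted(latency, key=latency.get)
    size = -(-len(ranked) // num_tiers)  # ceiling division
    return [ranked[i:i + size] for i in range(0, len(ranked), size)]

def select_tier_and_clients(tiers, tier_acc, clients_per_round):
    # Adaptive tier selection (sketch): bias the draw toward tiers whose
    # observed accuracy is lower, so slower tiers' data is not starved.
    weights = [1.0 - tier_acc.get(t, 0.0) + 1e-3 for t in range(len(tiers))]
    t = random.choices(range(len(tiers)), weights=weights)[0]
    return t, random.sample(tiers[t], min(clients_per_round, len(tiers[t])))

# Hypothetical profiling pass: per-client round latency in seconds.
latency = {f"client_{i}": random.uniform(1.0, 10.0) for i in range(50)}
tiers = assign_tiers(latency, num_tiers=5)
tier_acc = {}
for rnd in range(100):
    t, sel = select_tier_and_clients(tiers, tier_acc, clients_per_round=5)
    acc = random.random()  # stand-in for the accuracy observed this round
    tier_acc[t] = 0.9 * tier_acc.get(t, acc) + 0.1 * acc  # running estimate

Because all participants in a round come from one tier, the round's duration is bounded by that tier's slowest client rather than by the slowest client overall, which is the essence of the straggler mitigation; re-running assign_tiers periodically with fresh latency measurements gives the on-the-fly re-tiering described above.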
Related papers
- TPFL: Tsetlin-Personalized Federated Learning with Confidence-Based Clustering [0.0]
We propose a novel approach called Tsetlin-Personalized Federated Learning (TPFL).
Models are grouped into clusters based on their confidence towards a specific class.
Clients share only what they are confident about, eliminating erroneous weight aggregation.
Results show that TPFL performs better than baseline methods, with 98.94% accuracy on MNIST, 98.52% on FashionMNIST, and 91.16% on FEMNIST.
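As a rough illustration of the confidence-based grouping described above, the sketch below clusters clients by the classes they are confident in, so that aggregation for a class draws only on confident clients. The threshold and the data layout are assumptions; the actual TPFL clustering over Tsetlin machine models is more involved.

from collections import defaultdict

def cluster_by_confidence(client_conf, threshold=0.8):
    # client_conf maps client id -> {class label: confidence}. A client
    # joins the cluster of every class it is confident about, so each
    # class is aggregated only from clients that trust their model there.
    clusters = defaultdict(list)
    for cid, conf in client_conf.items():
        for label, c in conf.items():
            if c >= threshold:
                clusters[label].append(cid)
    return dict(clusters)

conf = {"a": {0: 0.95, 1: 0.40}, "b": {0: 0.20, 1: 0.90}, "c": {1: 0.85}}
print(cluster_by_confidence(conf))  # {0: ['a'], 1: ['b', 'c']}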
arXiv Detail & Related papers (2024-09-16T15:27:35Z)
- Can We Theoretically Quantify the Impacts of Local Updates on the Generalization Performance of Federated Learning? [50.03434441234569]
Federated Learning (FL) has gained significant popularity due to its effectiveness in training machine learning models across diverse sites without requiring direct data sharing.
While various algorithms have shown that FL with local updates is a communication-efficient distributed learning framework, the generalization performance of FL with local updates has received comparatively less attention.
arXiv Detail & Related papers (2024-09-05T19:00:18Z)
- Multi-level Personalized Federated Learning on Heterogeneous and Long-Tailed Data [10.64629029156029]
We introduce an innovative personalized Federated Learning framework, Multi-level Personalized Federated Learning (MuPFL).
MuPFL integrates three pivotal modules: Biased Activation Value Dropout (BAVD), Adaptive Cluster-based Model Update (ACMU), and Prior Knowledge-assisted Fine-tuning (PKCF).
Experiments on diverse real-world datasets show that MuPFL consistently outperforms state-of-the-art baselines, even under extreme non-i.i.d. and long-tail conditions.
arXiv Detail & Related papers (2024-05-10T11:52:53Z)
- FLASH: Federated Learning Across Simultaneous Heterogeneities [54.80435317208111]
FLASH (Federated Learning Across Simultaneous Heterogeneities) is a lightweight and flexible client selection algorithm.
It outperforms state-of-the-art FL frameworks under extensive sources of heterogeneity, achieving substantial and consistent improvements over state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-13T20:04:39Z)
- PFL-GAN: When Client Heterogeneity Meets Generative Models in Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation.
Empirical results from rigorous experimentation on several well-known datasets demonstrate the effectiveness of PFL-GAN.
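A minimal sketch of the similarity-then-weighted-aggregation step: here similarity is plain cosine similarity over flattened update vectors, whereas PFL-GAN learns similarity through its GAN-based strategy, so treat every name and choice below as an assumption.

import numpy as np

def similarity_weighted_aggregate(updates):
    # updates maps client id -> flattened parameter vector. Each client
    # receives a personalized average in which updates from similar
    # clients carry more weight.
    ids = list(updates)
    U = np.stack([updates[c] for c in ids])
    norms = np.linalg.norm(U, axis=1, keepdims=True)
    sim = (U @ U.T) / (norms * norms.T + 1e-12)  # pairwise cosine similarity
    W = np.clip(sim, 0.0, None)                  # drop negative affinities
    W /= W.sum(axis=1, keepdims=True)            # rows become mixing weights
    personalized = W @ U
    return {c: personalized[i] for i, c in enumerate(ids)}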
arXiv Detail & Related papers (2023-08-23T22:38:35Z)
- Automated Federated Learning in Mobile Edge Networks -- Fast Adaptation and Convergence [83.58839320635956]
Federated Learning (FL) can be used in mobile edge networks to train machine learning models in a distributed manner.
Recent work interprets FL within a Model-Agnostic Meta-Learning (MAML) framework, which brings FL significant advantages in fast adaptation and convergence over heterogeneous datasets.
This paper addresses how much benefit MAML brings to FL and how to maximize such benefit over mobile edge networks.
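In the MAML interpretation (a standard formulation from the literature, e.g. Per-FedAvg-style objectives, not necessarily this paper's exact setup), the global model $w$ serves as a meta-initialization that each client $i$ adapts with one local gradient step on its own loss $f_i$:

$$\min_{w} \; \frac{1}{N} \sum_{i=1}^{N} f_i\big(w - \alpha \nabla f_i(w)\big)$$

where $N$ is the number of clients and $\alpha$ the local adaptation rate. Optimizing the post-adaptation loss, rather than the loss of the shared model itself, is what yields fast per-client adaptation.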
arXiv Detail & Related papers (2023-03-23T02:42:10Z)
- Stochastic Clustered Federated Learning [21.811496586350653]
This paper proposes StoCFL, a novel clustered federated learning approach for generic non-IID issues.
In detail, StoCFL implements a flexible CFL framework that supports an arbitrary proportion of client participation and newly joined clients.
The results show that StoCFL obtains promising clustering results even when the number of clusters is unknown.
arXiv Detail & Related papers (2023-03-02T01:39:16Z)
- Federated Learning on Heterogeneous and Long-Tailed Data via Classifier Re-Training with Federated Features [24.679535905451758]
Federated learning (FL) provides a privacy-preserving solution for distributed machine learning tasks.
One challenging problem that severely damages the performance of FL models is the co-occurrence of data heterogeneity and long-tailed distribution.
We propose a novel privacy-preserving FL method for heterogeneous and long-tailed data via Classifier Re-training with Federated Features (CReFF).
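As a sketch of the classifier re-training step only: the snippet below re-fits a softmax head on a small, class-balanced set of server-side features. In CReFF the federated features are themselves optimized at the server to match client gradients, which this simplified version omits; all names here are assumptions.

import numpy as np

def retrain_classifier(feats, labels, num_classes, lr=0.1, epochs=200):
    # Re-train only the final linear layer on balanced "federated
    # features", so the head is no longer skewed by the long-tailed
    # label distribution seen during federated training.
    n, d = feats.shape
    W = np.zeros((num_classes, d))
    for _ in range(epochs):
        logits = feats @ W.T
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        probs = np.exp(logits)
        probs /= probs.sum(axis=1, keepdims=True)
        probs[np.arange(n), labels] -= 1.0           # softmax cross-entropy gradient
        W -= lr * (probs.T @ feats) / n
    return W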
arXiv Detail & Related papers (2022-04-28T10:35:11Z)
- Efficient Split-Mix Federated Learning for On-Demand and In-Situ Customization [107.72786199113183]
Federated learning (FL) provides a distributed learning framework in which multiple participants collaborate on learning without sharing raw data.
In this paper, we propose a novel Split-Mix FL strategy for heterogeneous participants that, once training is done, provides in-situ customization of model sizes and robustness.
arXiv Detail & Related papers (2022-03-18T04:58:34Z)
- Towards Federated Learning on Time-Evolving Heterogeneous Data [13.080665001587281]
Federated Learning (FL) is an emerging learning paradigm that preserves privacy by ensuring client data locality on edge devices.
Despite recent research efforts on improving the optimization of heterogeneous data, the impact of time-evolving heterogeneous data in real-world scenarios has not been well studied.
We propose Continual Federated Learning (CFL), a flexible framework that captures the time-evolving heterogeneity of FL.
arXiv Detail & Related papers (2021-12-25T14:58:52Z)
- A Principled Approach to Data Valuation for Federated Learning [73.19984041333599]
Federated learning (FL) is a popular technique to train machine learning (ML) models on decentralized data sources.
The Shapley value (SV) defines a unique payoff scheme that satisfies many desiderata for a data value notion.
This paper proposes a variant of the SV amenable to FL, which we call the federated Shapley value.
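For context, the classic Shapley value assigns client $i$, under a utility function $v$ defined on subsets of the client set $N$, the payoff

$$\phi_i = \sum_{S \subseteq N \setminus \{i\}} \frac{|S|!\,(|N|-|S|-1)!}{|N|!} \big( v(S \cup \{i\}) - v(S) \big),$$

i.e., its marginal contribution averaged over all orders in which clients could join. The federated variant, roughly speaking, evaluates such marginal contributions round by round over the clients that actually participate in each round and accumulates them across rounds, avoiding retraining a model from scratch for every subset; see the paper for the precise definition.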
arXiv Detail & Related papers (2020-09-14T04:37:54Z)