CSAFL: A Clustered Semi-Asynchronous Federated Learning Framework
- URL: http://arxiv.org/abs/2104.08184v1
- Date: Fri, 16 Apr 2021 15:51:02 GMT
- Title: CSAFL: A Clustered Semi-Asynchronous Federated Learning Framework
- Authors: Yu Zhang, Moming Duan, Duo Liu, Li Li, Ao Ren, Xianzhang Chen, Yujuan
Tan, Chengliang Wang
- Abstract summary: Federated learning (FL) is an emerging distributed machine learning paradigm that protects privacy and tackles the problem of isolated data islands.
There are two main communication strategies of FL: synchronous FL and asynchronous FL.
We propose a clustered semi-asynchronous federated learning framework.
- Score: 14.242716751043533
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning (FL) is an emerging distributed machine learning paradigm
that protects privacy and tackles the problem of isolated data islands. At
present, FL has two main communication strategies: synchronous FL and
asynchronous FL. Synchronous FL offers high model precision and fast
convergence, but its synchronous communication runs the risk that the central
server waits too long for slow devices, namely, the straggler effect, which has
a negative impact on time-critical applications. Asynchronous FL has a natural
advantage in mitigating the straggler effect, but it carries the risks of model
quality degradation and server crashes. Therefore, we combine the advantages of
these two strategies to propose
a clustered semi-asynchronous federated learning (CSAFL) framework. We evaluate
CSAFL based on four imbalanced federated datasets in a non-IID setting and
compare CSAFL to the baseline methods. The experimental results show that CSAFL
significantly improves test accuracy by more than +5% on the four datasets
compared to TA-FedAvg. In particular, CSAFL improves absolute test accuracy by
+34.4% on non-IID FEMNIST compared to TA-FedAvg.
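The abstract describes CSAFL only at a high level: devices are grouped, and each
group blends synchronous waiting with asynchronous tolerance of stragglers. As a
rough illustration of that idea, the Python sketch below shows one plausible
semi-asynchronous aggregation round for a single cluster; the deadline, the
staleness discount, and the update format are illustrative assumptions, not the
paper's actual algorithm.

```python
import numpy as np

def semi_async_cluster_round(global_model, client_updates, deadline,
                             staleness_discount=0.5):
    """One illustrative semi-asynchronous round for a single cluster.

    Updates arriving before `deadline` are weighted by sample count, as in
    synchronous FedAvg; updates arriving after it are kept but down-weighted
    by `staleness_discount` instead of being discarded.  Each item of
    `client_updates` is a dict with keys 'delta' (np.ndarray), 'n_samples'
    (int) and 'arrival' (float, seconds since the round started).
    """
    deltas, weights = [], []
    for upd in client_updates:
        w = float(upd["n_samples"])
        if upd["arrival"] > deadline:   # straggler: discount, don't drop
            w *= staleness_discount
        deltas.append(upd["delta"])
        weights.append(w)
    if not deltas:
        return global_model             # nothing arrived this round
    weights = np.asarray(weights)
    weights /= weights.sum()
    return global_model + sum(w * d for w, d in zip(weights, deltas))

# Illustrative usage: one on-time update and one straggler.
model = np.zeros(4)
updates = [
    {"delta": np.ones(4), "n_samples": 100, "arrival": 3.0},
    {"delta": -np.ones(4), "n_samples": 50, "arrival": 9.0},
]
model = semi_async_cluster_round(model, updates, deadline=5.0)
```

A complete system would run one such round per cluster; how CSAFL actually
groups devices and reconciles cluster models is specified in the paper itself.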
Related papers
- Robust Model Aggregation for Heterogeneous Federated Learning: Analysis and Optimizations [35.58487905412915]
We propose a time-driven SFL (T-SFL) framework for heterogeneous systems.
To evaluate the learning performance of T-SFL, we provide an upper bound on the global loss function.
We develop a discriminative model selection algorithm that removes local models from clients whose number of iterations falls below a predetermined threshold.
arXiv Detail & Related papers (2024-05-11T11:55:26Z)
- AEDFL: Efficient Asynchronous Decentralized Federated Learning with Heterogeneous Devices [61.66943750584406]
We propose an Asynchronous Efficient Decentralized FL framework, i.e., AEDFL, in heterogeneous environments.
First, we propose an asynchronous FL system model with an efficient model aggregation method for improving the FL convergence.
Second, we propose a dynamic staleness-aware model update approach to achieve superior accuracy.
Third, we propose an adaptive sparse training method to reduce communication and computation costs without significant accuracy degradation.
arXiv Detail & Related papers (2023-12-18T05:18:17Z)
- Privacy-preserving Federated Primal-dual Learning for Non-convex and Non-smooth Problems with Model Sparsification [51.04894019092156]
Federated learning (FL) has been recognized as a rapidly growing area, where the model is trained over distributed clients under the orchestration of a parameter server (PS).
In this paper, we propose a novel privacy-preserving federated primal-dual algorithm with model sparsification for non-convex and non-smooth FL problems.
Its unique properties and the corresponding analyses are also presented.
arXiv Detail & Related papers (2023-10-30T14:15:47Z)
- Time-sensitive Learning for Heterogeneous Federated Edge Intelligence [52.83633954857744]
We investigate real-time machine learning in a federated edge intelligence (FEI) system.
FEI systems exhibit heterogeneous communication and computational resource distribution.
We propose a time-sensitive federated learning (TS-FL) framework to minimize the overall run-time for collaboratively training a shared ML model.
arXiv Detail & Related papers (2023-01-26T08:13:22Z)
- HFedMS: Heterogeneous Federated Learning with Memorable Data Semantics in Industrial Metaverse [49.1501082763252]
This paper presents HFedMS for incorporating practical FL into the emerging Industrial Metaverse.
It reduces data heterogeneity through dynamic grouping and training mode conversion.
Then, it compensates for the forgotten knowledge by fusing compressed historical data semantics.
Experiments have been conducted on the streamed non-i.i.d. FEMNIST dataset using 368 simulated devices.
arXiv Detail & Related papers (2022-11-07T04:33:24Z)
- Semi-Synchronous Personalized Federated Learning over Mobile Edge Networks [88.50555581186799]
We propose a semi-synchronous PFL algorithm, termed Semi-Synchronous Personalized Federated Averaging (PerFedS2), over mobile edge networks.
We derive an upper bound of the convergence rate of PerFedS2 in terms of the number of participants per global round and the number of rounds.
Experimental results verify the effectiveness of PerFedS2 in saving training time as well as guaranteeing the convergence of training loss.
arXiv Detail & Related papers (2022-09-27T02:12:43Z)
- Time-triggered Federated Learning over Wireless Networks [48.389824560183776]
We present a time-triggered FL algorithm (TT-Fed) over wireless networks.
Compared with its baseline schemes, the proposed TT-Fed algorithm improves the converged test accuracy by up to 12.5% and 5%, respectively.
arXiv Detail & Related papers (2022-04-26T16:37:29Z)
- Towards Efficient and Stable K-Asynchronous Federated Learning with Unbounded Stale Gradients on Non-IID Data [10.299577499118548]
Federated learning (FL) is an emerging privacy-preserving paradigm that enables multiple participants to train a global model without uploading raw data.
This paper proposes a two-stage weighted K-asynchronous FL algorithm with an adaptive learning rate (WKAFL); a simplified staleness-weighting sketch appears after this list.
Experiments implemented on both benchmark and synthetic FL datasets show that WKAFL has better overall performance compared to existing algorithms.
arXiv Detail & Related papers (2022-03-02T16:17:23Z)
- Stragglers Are Not Disaster: A Hybrid Federated Learning Algorithm with Delayed Gradients [21.63719641718363]
Federated learning (FL) is a new machine learning framework that trains a joint model across a large number of decentralized computing devices.
This paper presents a novel FL algorithm, namely Hybrid Federated Learning (HFL), to achieve a learning balance in efficiency and effectiveness.
arXiv Detail & Related papers (2021-02-12T02:27:44Z)
- FedAT: A High-Performance and Communication-Efficient Federated Learning System with Asynchronous Tiers [22.59875034596411]
We present FedAT, a novel Federated learning method with Asynchronous Tiers under Non-i.i.d. data.
FedAT minimizes the straggler effect with improved convergence speed and test accuracy (a tier-based aggregation sketch appears after this list).
Results show that FedAT improves the prediction performance by up to 21.09%, and reduces the communication cost by up to 8.5x, compared to state-of-the-art FL methods.
arXiv Detail & Related papers (2020-10-12T18:38:51Z)
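Several entries above (AEDFL's staleness-aware update, T-SFL's threshold-based
model selection, and WKAFL's weighted K-asynchronous aggregation) share one
underlying pattern: asynchronously arriving updates are filtered or
down-weighted according to how stale they are. The sketch below illustrates
that generic pattern; the decay rule, the staleness cutoff, and the update
format are assumptions made for illustration and do not reproduce any of the
cited algorithms.

```python
import numpy as np

def staleness_weighted_aggregate(global_model, updates, current_round,
                                 max_staleness=10, decay=0.9):
    """Fold asynchronously arriving client updates into the global model.

    Each update is a dict with 'delta' (np.ndarray, computed against the
    model of round 'base_round'), 'base_round' (int) and 'n_samples' (int).
    Updates older than `max_staleness` rounds are dropped; the rest are
    down-weighted by `decay ** staleness`, so fresh gradients dominate.
    """
    deltas, weights = [], []
    for upd in updates:
        staleness = current_round - upd["base_round"]
        if staleness > max_staleness:   # too stale: discard entirely
            continue
        deltas.append(upd["delta"])
        weights.append(upd["n_samples"] * decay ** staleness)
    if not deltas:
        return global_model             # nothing usable this round
    weights = np.asarray(weights, dtype=float)
    weights /= weights.sum()
    return global_model + sum(w * d for w, d in zip(weights, deltas))
```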
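FedAT's tiering idea, as summarized above, groups clients by speed so that each
tier trains synchronously while the server combines tier models asynchronously.
The sketch below shows a hypothetical version of the two pieces involved:
assigning clients to tiers by latency, and weighting slower, less frequently
updating tiers more heavily in the cross-tier average. Both rules are
illustrative assumptions rather than the published FedAT method.

```python
import numpy as np

def assign_tiers(client_latencies, n_tiers=3):
    """Split client ids into `n_tiers` groups of similar latency, fastest first.

    `client_latencies` maps client id -> measured round latency in seconds.
    """
    order = sorted(client_latencies, key=client_latencies.get)
    return [list(chunk) for chunk in np.array_split(order, n_tiers)]

def cross_tier_model(tier_models, tier_update_counts):
    """Weighted average of per-tier models.

    Tiers that have pushed fewer updates (the slower tiers) receive a larger
    weight so the global model is not dominated by the fastest tier.
    """
    counts = np.asarray(tier_update_counts, dtype=float)
    weights = 1.0 / (counts + 1.0)      # fewer updates -> larger weight
    weights /= weights.sum()
    return sum(w * m for w, m in zip(weights, tier_models))
```

In a full system, each tier would run its own synchronous FedAvg loop and push
its tier model to the server whenever it completes a round.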
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.