Flexible Clustered Federated Learning for Client-Level Data Distribution Shift
- URL: http://arxiv.org/abs/2108.09749v1
- Date: Sun, 22 Aug 2021 15:11:39 GMT
- Title: Flexible Clustered Federated Learning for Client-Level Data Distribution Shift
- Authors: Moming Duan, Duo Liu, Xinyuan Ji, Yu Wu, Liang Liang, Xianzhang Chen, Yujuan Tan
- Abstract summary: We propose a flexible clustered federated learning (CFL) framework named FlexCFL.
We show that FlexCFL can significantly improve absolute test accuracy by +10.6% on FEMNIST compared to FedAvg.
We also evaluate FlexCFL on several open datasets and make comparisons with related CFL frameworks.
- Score: 13.759582953827229
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) enables multiple participating devices to
collaboratively contribute to a global neural network model while keeping the
training data local. Unlike the centralized training setting, the training
data in FL is non-IID, imbalanced (statistically heterogeneous), and subject
to distribution shift across the federated network, which increases the
divergence between the local models and the global model and further degrades
performance. In this paper, we propose a flexible clustered federated learning
(CFL) framework named FlexCFL, in which we 1) group clients for training based
on the similarity of their optimization directions, to lower training
divergence; 2) implement an efficient cold-start mechanism for newcomer
devices, for framework scalability and practicality; 3) flexibly migrate
clients to meet the challenge of client-level data distribution shift. FlexCFL
achieves improvements by dividing the joint optimization into per-group
sub-optimizations and strikes a balance between accuracy and communication
efficiency in the distribution-shift environment. We analyze the convergence
and complexity of FlexCFL to demonstrate its efficiency. We also evaluate
FlexCFL on several open datasets and compare it with related CFL frameworks.
The results show that FlexCFL significantly improves absolute test accuracy:
by +10.6% on FEMNIST compared to FedAvg, by +3.5% on FashionMNIST compared to
FedProx, and by +8.4% on MNIST compared to FeSEM. The experiments also show
that FlexCFL is communication-efficient in the distribution-shift environment.
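To make the grouping mechanism concrete, the sketch below illustrates the three ideas from the abstract: clustering clients by the cosine similarity of their optimization directions, cold-starting a newcomer against the group centroids, and migrating a client whose data distribution has shifted. This is a minimal sketch under our own assumptions (NumPy and scikit-learn available); the function and parameter names (update_direction, form_groups, cold_start, maybe_migrate, margin) are illustrative placeholders, not identifiers from the FlexCFL implementation.

```python
# Minimal illustrative sketch of FlexCFL-style grouping; NOT the authors'
# implementation. All names are placeholders chosen for this example.
import numpy as np
from sklearn.cluster import KMeans

def update_direction(global_weights, local_weights):
    # A client's optimization direction: the flattened difference between
    # its locally trained weights and the current global weights.
    return np.concatenate([(l - g).ravel()
                           for g, l in zip(global_weights, local_weights)])

def form_groups(directions, num_groups, seed=0):
    # Cluster clients by cosine similarity of their directions; k-means on
    # L2-normalized vectors approximates spherical (cosine) k-means.
    X = directions / (np.linalg.norm(directions, axis=1, keepdims=True) + 1e-12)
    km = KMeans(n_clusters=num_groups, n_init=10, random_state=seed).fit(X)
    centers = km.cluster_centers_
    centers = centers / (np.linalg.norm(centers, axis=1, keepdims=True) + 1e-12)
    return km.labels_, centers

def cold_start(direction, centers):
    # Newcomer cold start: join the group whose centroid is most similar,
    # without re-running the clustering over all clients.
    d = direction / (np.linalg.norm(direction) + 1e-12)
    return int(np.argmax(centers @ d))

def maybe_migrate(direction, current_group, centers, margin=0.05):
    # Client-level distribution shift: migrate only if another group is
    # clearly more similar (the margin avoids oscillating between groups).
    d = direction / (np.linalg.norm(direction) + 1e-12)
    sims = centers @ d
    best = int(np.argmax(sims))
    return best if sims[best] > sims[current_group] + margin else current_group
```

Each group can then run its own FedAvg-style aggregation over its members, which corresponds to the abstract's division of the joint optimization into per-group sub-optimizations.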
Related papers
- FedClust: Tackling Data Heterogeneity in Federated Learning through Weight-Driven Client Clustering [26.478852701376294]
Federated learning (FL) is an emerging distributed machine learning paradigm.
One of the major challenges in FL is the presence of uneven data distributions across client devices.
We propose FedClust, a novel approach for CFL that leverages the correlation between local model weights and the data distribution of clients.
arXiv Detail & Related papers (2024-07-09T02:47:16Z)
- Embracing Federated Learning: Enabling Weak Client Participation via Partial Model Training [21.89214794178211]
In Federated Learning (FL), clients may have weak devices that cannot train the full model or even hold it in their memory space.
We propose EmbracingFL, a general FL framework that allows all available clients to join the distributed training.
Our empirical study shows that EmbracingFL consistently achieves accuracy as high as if all clients were strong, outperforming state-of-the-art width-reduction methods.
arXiv Detail & Related papers (2024-06-21T13:19:29Z)
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- FedClust: Optimizing Federated Learning on Non-IID Data through Weight-Driven Client Clustering [28.057411252785176]
Federated learning (FL) is an emerging distributed machine learning paradigm enabling collaborative model training on decentralized devices without exposing their local data.
This paper proposes FedClust, a novel CFL approach leveraging correlations between local model weights and client data distributions.
arXiv Detail & Related papers (2024-03-07T01:50:36Z)
- FedLALR: Client-Specific Adaptive Learning Rates Achieve Linear Speedup for Non-IID Data [54.81695390763957]
Federated learning is an emerging distributed machine learning method.
We propose a heterogeneous local variant of AMSGrad, named FedLALR, in which each client adjusts its learning rate.
We show that our client-specific, auto-tuned learning rate scheduling converges and achieves linear speedup with respect to the number of clients.
arXiv Detail & Related papers (2023-09-18T12:35:05Z)
- PFL-GAN: When Client Heterogeneity Meets Generative Models in Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation.
Empirical results from rigorous experimentation on several well-known datasets demonstrate the effectiveness of PFL-GAN.
arXiv Detail & Related papers (2023-08-23T22:38:35Z)
- Towards Instance-adaptive Inference for Federated Learning [80.38701896056828]
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to learn a powerful global model by aggregating the results of local training.
In this paper, we present a novel FL algorithm, i.e., FedIns, to handle intra-client data heterogeneity by enabling instance-adaptive inference in the FL framework.
Our experiments show that FedIns outperforms state-of-the-art FL algorithms, e.g., with a 6.64% improvement over the top-performing method at less than 15% of the communication cost on Tiny-ImageNet.
arXiv Detail & Related papers (2023-08-11T09:58:47Z)
- Stochastic Clustered Federated Learning [21.811496586350653]
This paper proposes StoCFL, a novel clustered federated learning approach for generic Non-IID issues.
In detail, StoCFL implements a flexible CFL framework that supports an arbitrary proportion of client participation and newly joined clients.
The results show that StoCFL could obtain promising cluster results even when the number of clusters is unknown.
arXiv Detail & Related papers (2023-03-02T01:39:16Z)
- FL Games: A Federated Learning Framework for Distribution Shifts [71.98708418753786]
Federated learning aims to train predictive models for data that is distributed across clients, under the orchestration of a server.
We propose FL GAMES, a game-theoretic framework for federated learning that learns causal features that are invariant across clients.
arXiv Detail & Related papers (2022-10-31T22:59:03Z)
- Multi-Edge Server-Assisted Dynamic Federated Learning with an Optimized Floating Aggregation Point [51.47520726446029]
Cooperative edge learning (CE-FL) is a distributed machine learning architecture.
We model the processes carried out during CE-FL and provide an analytical study of its training.
We show the effectiveness of our framework with the data collected from a real-world testbed.
arXiv Detail & Related papers (2022-03-26T00:41:57Z)
- FedGroup: Efficient Clustered Federated Learning via Decomposed Data-Driven Measure [18.083188787905083]
We propose a novel clustered federated learning (CFL) framework named FedGroup.
We show that FedGroup can significantly improve absolute test accuracy by +14.1% on FEMNIST compared to FedAvg.
We also evaluate FedGroup and FedGrouProx (combined with FedProx) on several open datasets.
arXiv Detail & Related papers (2020-10-14T08:15:34Z)
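Since FlexCFL builds on FedGroup, a rough sketch of a decomposed data-driven clustering measure in FedGroup's spirit may also help: project each client's flattened update onto the top singular directions of the stacked update matrix and cluster the resulting cosine-similarity profiles with Euclidean k-means. This is our reading of the title, not the paper's code; edc_embedding, m, and num_groups are invented names, and NumPy plus scikit-learn are assumed.

```python
# Rough sketch of a decomposed data-driven measure; our own assumption-laden
# reading, NOT the FedGroup paper's code.
import numpy as np
from sklearn.cluster import KMeans

def edc_embedding(updates, m):
    # Decompose the stacked client updates (n_clients x dim) with SVD, keep
    # the top-m right-singular directions, and represent each client by its
    # cosine similarities to those directions.
    _, _, vt = np.linalg.svd(updates, full_matrices=False)
    basis = vt[:m]                                    # (m, dim), unit-norm rows
    norms = np.linalg.norm(updates, axis=1, keepdims=True) + 1e-12
    return (updates / norms) @ basis.T                # (n_clients, m) profile

def cluster_clients(updates, num_groups, m=10, seed=0):
    # K-means over the low-dimensional similarity profiles; Euclidean
    # distance in this decomposed space plays the role of the measure.
    emb = edc_embedding(updates, min(m, min(updates.shape)))
    return KMeans(n_clusters=num_groups, n_init=10,
                  random_state=seed).fit_predict(emb)
```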