Addressing Heterogeneity in Federated Learning via Distributional
Transformation
- URL: http://arxiv.org/abs/2210.15025v1
- Date: Wed, 26 Oct 2022 20:42:01 GMT
- Title: Addressing Heterogeneity in Federated Learning via Distributional
Transformation
- Authors: Haolin Yuan, Bo Hui, Yuchen Yang, Philippe Burlina, Neil Zhenqiang
Gong, and Yinzhi Cao
- Abstract summary: Federated learning (FL) allows multiple clients to collaboratively train a deep learning model.
One major challenge of FL is when data distribution is heterogeneous, i.e., differs from one client to another.
We propose a novel framework, called DisTrans, to improve FL performance (i.e., model accuracy) via train- and test-time distributional transformations.
- Score: 37.99565338024758
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) allows multiple clients to collaboratively train a
deep learning model. One major challenge of FL is when data distribution is
heterogeneous, i.e., differs from one client to another. Existing personalized
FL algorithms are only applicable to narrow cases, e.g., one or two data
classes per client, and therefore they do not satisfactorily address FL under
varying levels of data heterogeneity. In this paper, we propose a novel
framework, called DisTrans, to improve FL performance (i.e., model accuracy)
via train- and test-time distributional transformations along with a
double-input-channel model structure. DisTrans works by optimizing
distributional offsets and models for each FL client to shift their data
distribution, and by aggregating these offsets at the FL server to further
improve performance in the presence of distributional heterogeneity. Our evaluation on multiple
benchmark datasets shows that DisTrans outperforms state-of-the-art FL methods
and data augmentation methods under various settings and different degrees of
client distributional heterogeneity.
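To make the offset mechanism concrete, below is a minimal NumPy sketch of the core idea as we read it: each client learns an additive offset that transforms its inputs at train and test time, and the server aggregates offsets across clients. The additive form, the stand-in objective, and all names are illustrative assumptions; the paper's double-input-channel architecture and joint model/offset training are not reproduced here.

```python
import numpy as np

# Illustrative sketch only: per-client distributional offsets with server-side
# aggregation. DisTrans optimizes offsets jointly with the model via the
# training loss; here a stand-in objective (zero-mean transformed data) keeps
# the sketch self-contained.

rng = np.random.default_rng(0)

def client_update(data: np.ndarray, offset: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """One local step: nudge the offset so the transformed data approaches
    zero mean (gradient step on 0.5 * ||mean(data + offset)||^2)."""
    transformed = data + offset            # train/test-time transformation
    return offset - lr * transformed.mean(axis=0)

# Three clients with heterogeneous input distributions (different means).
clients = [rng.normal(loc=m, scale=1.0, size=(128, 4)) for m in (-2.0, 0.5, 3.0)]
offsets = [np.zeros(4) for _ in clients]

for _ in range(50):
    offsets = [client_update(x, off) for x, off in zip(clients, offsets)]
    # The server aggregates client offsets; in DisTrans this aggregate is used
    # to further improve performance under distributional heterogeneity.
    server_offset = np.mean(offsets, axis=0)

print("per-client offsets:", [np.round(o, 2) for o in offsets])
print("aggregated offset :", np.round(server_offset, 2))
```

Each client's offset converges toward the negation of its local mean, i.e., the transformation that cancels that client's distributional shift under the stand-in objective.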
Related papers
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round; this conventional round is sketched after the entry.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
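For context, here is a generic aggregate-then-adapt round in the FedAvg style, i.e., the conventional framework the FedAF summary contrasts against. The linear model, synthetic data, and hyperparameters are stand-ins; FedAF's own aggregation-free procedure is not detailed in the summary and is not shown.

```python
import numpy as np

# Generic aggregate-then-adapt round (FedAvg-style), the framework FedAF
# departs from. Client data, the linear model, and the loss are illustrative.

rng = np.random.default_rng(1)

def local_update(w_global: np.ndarray, X: np.ndarray, y: np.ndarray,
                 lr: float = 0.05, steps: int = 10) -> np.ndarray:
    """Adapt: client starts from the aggregated global model, runs local SGD."""
    w = w_global.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w

# Heterogeneous clients: each draws labels from a different underlying weight.
clients = []
for true_w in ([1.0, -1.0], [2.0, 0.0], [0.5, 1.5]):
    X = rng.normal(size=(64, 2))
    y = X @ np.array(true_w) + 0.1 * rng.normal(size=64)
    clients.append((X, y))

w_global = np.zeros(2)
for _ in range(20):
    local_ws = [local_update(w_global, X, y) for X, y in clients]
    w_global = np.mean(local_ws, axis=0)        # aggregate: server averages models

print("global model after 20 rounds:", np.round(w_global, 3))
```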
- FLASH: Federated Learning Across Simultaneous Heterogeneities [54.80435317208111]
FLASH (Federated Learning Across Simultaneous Heterogeneities) is a lightweight and flexible client selection algorithm.
It outperforms state-of-the-art FL frameworks under a wide range of heterogeneity sources, achieving substantial and consistent improvements over state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-13T20:04:39Z)
- Tackling the Unlimited Staleness in Federated Learning with Intertwined Data and Device Heterogeneities [4.9851737525099225]
Federated Learning (FL) is often affected by both data and device heterogeneities.
In this paper, we present a new FL framework that leverages the gradient inversion technique to convert stale model updates into unstale ones.
Our approach improves trained model accuracy by up to 20% and speeds up FL training by up to 35%.
arXiv Detail & Related papers (2023-09-24T03:19:40Z)
- PFL-GAN: When Client Heterogeneity Meets Generative Models in Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation scheme; a generic sketch of similarity-weighted aggregation follows this entry.
Empirical results from rigorous experimentation on several well-known datasets demonstrate the effectiveness of PFL-GAN.
arXiv Detail & Related papers (2023-08-23T22:38:35Z)
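The summary names similarity-weighted collaborative aggregation without giving details; the sketch below is one generic reading, assuming cosine similarity between client parameter vectors and a softmax weighting. It is not PFL-GAN's actual procedure, which additionally involves GAN sharing.

```python
import numpy as np

# Generic similarity-weighted aggregation (illustrative assumption): each
# client gets a personalized model built by weighting other clients' models
# according to cosine similarity with its own.

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def personalized_aggregate(client_params):
    personalized = []
    for w_i in client_params:
        sims = np.array([cosine(w_i, w_j) for w_j in client_params])
        weights = np.exp(sims) / np.exp(sims).sum()   # softmax over similarities
        personalized.append(sum(w * p for w, p in zip(weights, client_params)))
    return personalized

rng = np.random.default_rng(2)
base = rng.normal(size=8)
params = [base + 0.05 * rng.normal(size=8),    # two clients with similar models
          base + 0.05 * rng.normal(size=8),
          -base + 0.05 * rng.normal(size=8)]   # one dissimilar client

for i, w in enumerate(personalized_aggregate(params)):
    print(f"client {i}: similarity to base = {cosine(w, base):+.2f}")
```

Similar clients pool their knowledge, while the dissimilar client's personalized model stays close to its own direction.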
- FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning [37.96957782129352]
We propose a finetuning framework tailored to heterogeneous multi-modal foundation models, called Federated Dual-Adapter Teacher (FedDAT).
FedDAT addresses data heterogeneity by regularizing the client local updates and applying Mutual Knowledge Distillation (MKD) for efficient knowledge transfer; a toy MKD step follows this entry.
To demonstrate its effectiveness, we conduct extensive experiments on four multi-modality FL benchmarks with different types of data heterogeneity.
arXiv Detail & Related papers (2023-08-21T21:57:01Z)
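Since the summary names Mutual Knowledge Distillation without details, here is a toy MKD step: two models distill into each other through KL divergence on temperature-softened predictions. The two-logit setup, temperature, and the local/partner pairing are assumptions; FedDAT's dual-adapter specifics are not reproduced.

```python
import numpy as np

# Toy mutual knowledge distillation (MKD) step: two models distill into each
# other via KL divergence on temperature-softened predictions. Illustrative
# only; not FedDAT's dual-adapter implementation.

def softmax(z: np.ndarray, T: float = 2.0) -> np.ndarray:
    e = np.exp(z / T - np.max(z / T))   # temperature-softened, stabilized
    return e / e.sum()

def kl(p: np.ndarray, q: np.ndarray) -> float:
    return float(np.sum(p * np.log((p + 1e-12) / (q + 1e-12))))

logits_a = np.array([2.0, 0.5, -1.0])   # e.g., a local adapter's logits
logits_b = np.array([1.5, 1.0, -0.5])   # e.g., its distillation partner

p_a, p_b = softmax(logits_a), softmax(logits_b)
# Each model's distillation loss pulls it toward the other's predictions.
loss_a = kl(p_b, p_a)    # train A to match B
loss_b = kl(p_a, p_b)    # train B to match A
print(f"KL(B||A) = {loss_a:.4f}, KL(A||B) = {loss_b:.4f}")
```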
- Towards Instance-adaptive Inference for Federated Learning [80.38701896056828]
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to learn a powerful global model by aggregating local training updates.
In this paper, we present a novel FL algorithm, i.e., FedIns, to handle intra-client data heterogeneity by enabling instance-adaptive inference in the FL framework.
Our experiments show that FedIns outperforms state-of-the-art FL algorithms, e.g., a 6.64% improvement over the top-performing method with less than 15% communication cost on Tiny-ImageNet.
arXiv Detail & Related papers (2023-08-11T09:58:47Z)
- FL Games: A Federated Learning Framework for Distribution Shifts [71.98708418753786]
Federated learning aims to train predictive models for data that is distributed across clients, under the orchestration of a server.
We propose FL GAMES, a game-theoretic framework for federated learning that learns causal features that are invariant across clients.
arXiv Detail & Related papers (2022-10-31T22:59:03Z)
- Gradient Masked Averaging for Federated Learning [24.687254139644736]
Federated learning allows a large number of clients with heterogeneous data to coordinate learning of a unified global model.
Standard FL algorithms involve averaging of model parameters or gradient updates to approximate the global model at the server.
We propose a gradient masked averaging approach for FL as an alternative to the standard averaging of client updates; a sketch of the masking idea follows this entry.
arXiv Detail & Related papers (2022-01-28T08:42:43Z)
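A minimal sketch of the masking idea as we read it: before averaging, down-weight update coordinates on which clients' gradient signs disagree. The hard 0.8 agreement threshold is an illustrative assumption; the paper's exact agreement score and any soft-masking variant may differ.

```python
import numpy as np

# Sketch of gradient masked averaging: instead of plainly averaging client
# updates, zero out coordinates where clients' update signs disagree.
# The 0.8 threshold is an illustrative choice, not the paper's setting.

def masked_average(updates, tau: float = 0.8) -> np.ndarray:
    U = np.stack(updates)                            # (num_clients, num_params)
    agreement = np.abs(np.mean(np.sign(U), axis=0))  # 1.0 = all signs agree
    mask = (agreement >= tau).astype(U.dtype)        # keep well-agreed coords
    return mask * U.mean(axis=0)

# Coordinate 0: clients agree on direction; coordinate 1: they conflict.
updates = [np.array([0.9, 0.8]), np.array([1.1, -0.7]), np.array([1.0, 0.6])]
print("plain average :", np.mean(updates, axis=0))
print("masked average:", masked_average(updates))
```

The conflicted coordinate is suppressed in the masked average, which is the intuition behind favoring update directions that generalize across heterogeneous clients.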
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.