FedGH: Heterogeneous Federated Learning with Generalized Global Header
- URL: http://arxiv.org/abs/2303.13137v2
- Date: Tue, 1 Aug 2023 16:30:48 GMT
- Title: FedGH: Heterogeneous Federated Learning with Generalized Global Header
- Authors: Liping Yi, Gang Wang, Xiaoguang Liu, Zhuan Shi, Han Yu
- Abstract summary: Federated learning (FL) is an emerging machine learning paradigm that allows multiple parties to train a shared model.
We propose a simple but effective Federated Global prediction Header (FedGH) approach.
FedGH trains a shared generalized global prediction header on representations extracted by the heterogeneous feature extractors of clients' models.
- Score: 16.26231633749833
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) is an emerging machine learning paradigm that allows
multiple parties to train a shared model collaboratively in a
privacy-preserving manner. Existing horizontal FL methods generally assume that
the FL server and clients hold the same model structure. However, due to system
heterogeneity and the need for personalization, enabling clients to hold models
with diverse structures has become an important direction. Existing
model-heterogeneous FL approaches often require publicly available datasets and
incur high communication and/or computational costs, which limit their
performances. To address these limitations, we propose a simple but effective
Federated Global prediction Header (FedGH) approach. It is a communication- and
computation-efficient model-heterogeneous FL framework which trains a shared
generalized global prediction header with representations extracted by
heterogeneous extractors for clients' models at the FL server. The trained
generalized global prediction header learns from different clients. The
acquired global knowledge is then transferred to clients to substitute each
client's local prediction header. We derive the non-convex convergence rate of
FedGH. Extensive experiments on two real-world datasets demonstrate that FedGH
achieves significantly better performance in both model-homogeneous
and -heterogeneous FL scenarios compared to seven state-of-the-art personalized
FL models, beating the best-performing baseline by up to 8.87% (for
model-homogeneous FL) and 1.83% (for model-heterogeneous FL) in terms of
average test accuracy, while saving up to 85.53% of communication overhead.
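
The abstract outlines the FedGH protocol only at a high level, so below is a minimal PyTorch sketch of one training round under stated assumptions: every heterogeneous extractor emits representations of a common dimension, and clients upload per-class average representations rather than raw data. All identifiers (GlobalHeader, class_representations, server_update), dimensions, and hyperparameters are illustrative and are not taken from the authors' implementation.

```python
# Minimal sketch of the FedGH idea from the abstract (not the authors' code).
# Assumption: clients' heterogeneous extractors all emit REP_DIM-dim features,
# and clients upload per-class average representations instead of raw data.
import torch
import torch.nn as nn

REP_DIM, NUM_CLASSES, IN_DIM = 64, 10, 32  # assumed sizes

class GlobalHeader(nn.Module):
    """Shared prediction header; identical structure on server and clients."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(REP_DIM, NUM_CLASSES)

    def forward(self, rep):
        return self.fc(rep)

def class_representations(extractor, x, y):
    """Average the extractor's outputs per class (the client-side upload)."""
    with torch.no_grad():
        reps = extractor(x)
    return {int(c): reps[y == c].mean(dim=0) for c in y.unique()}

def server_update(header, uploads, lr=0.01):
    """Train the generalized global header on uploaded (rep, label) pairs."""
    opt = torch.optim.SGD(header.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for per_client in uploads:
        for label, rep in per_client.items():
            opt.zero_grad()
            loss = loss_fn(header(rep.unsqueeze(0)), torch.tensor([label]))
            loss.backward()
            opt.step()
    return header.state_dict()  # global knowledge broadcast to clients

# One round with two clients holding structurally different extractors.
extractors = [
    nn.Sequential(nn.Linear(IN_DIM, 128), nn.ReLU(), nn.Linear(128, REP_DIM)),
    nn.Sequential(nn.Linear(IN_DIM, REP_DIM)),
]
header = GlobalHeader()
uploads = []
for ext in extractors:
    x = torch.randn(100, IN_DIM)                    # stand-in local data
    y = torch.randint(0, NUM_CLASSES, (100,))
    uploads.append(class_representations(ext, x, y))
global_params = server_update(header, uploads)
# Each client then loads global_params into its own header, substituting
# the local prediction header with the globally trained one.
```

Under these assumptions, only REP_DIM-sized representations and the small header travel between parties, so the per-round communication cost is independent of each client's extractor size, which is consistent with the communication savings the abstract reports.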
Related papers
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z) - pFedAFM: Adaptive Feature Mixture for Batch-Level Personalization in Heterogeneous Federated Learning [34.01721941230425]
We propose a model-heterogeneous personalized Federated learning approach with Adaptive Feature Mixture (pFedAFM) for supervised learning tasks.
It significantly outperforms 7 state-of-the-art MHPFL methods, achieving up to 7.93% accuracy improvement.
arXiv Detail & Related papers (2024-04-27T09:52:59Z) - FedImpro: Measuring and Improving Client Update in Federated Learning [77.68805026788836]
Federated Learning (FL) models often experience client drift caused by heterogeneous data.
We present an alternative perspective on client drift and aim to mitigate it by generating improved local models.
arXiv Detail & Related papers (2024-02-10T18:14:57Z) - FedSSA: Semantic Similarity-based Aggregation for Efficient Model-Heterogeneous Personalized Federated Learning [40.827571502726805]
Federated learning (FL) is a privacy-preserving collaborative machine learning paradigm.
Model-Heterogeneous Personalized FL (MHPFL) has emerged to address this challenge.
Existing MHPFL approaches often rely on a public dataset with the same nature as the learning task, or incur high computation and communication costs.
We propose the Federated Semantic Similarity Aggregation (FedSSA) approach for supervised classification tasks.
FedSSA achieves up to 3.62% higher accuracy, 15.54 times higher communication efficiency, and 15.52 times higher computational efficiency compared to 7 state-of-the-art MHPFL baselines.
arXiv Detail & Related papers (2023-12-14T14:55:32Z) - PFL-GAN: When Client Heterogeneity Meets Generative Models in
Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation.
Empirical results from rigorous experimentation on several well-known datasets demonstrate the effectiveness of PFL-GAN.
arXiv Detail & Related papers (2023-08-23T22:38:35Z) - Towards Instance-adaptive Inference for Federated Learning [80.38701896056828]
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to learn a powerful global model by aggregating local training updates.
In this paper, we present a novel FL algorithm, i.e., FedIns, to handle intra-client data heterogeneity by enabling instance-adaptive inference in the FL framework.
Our experiments show that our FedIns outperforms state-of-the-art FL algorithms, e.g., a 6.64% improvement against the top-performing method with less than 15% communication cost on Tiny-ImageNet.
arXiv Detail & Related papers (2023-08-11T09:58:47Z) - Closing the Gap between Client and Global Model Performance in
Heterogeneous Federated Learning [2.1044900734651626]
We show how the chosen approach for training custom client models has an impact on the global model.
We propose a new approach that combines KD and Learning without Forgetting (LwoF) to produce improved personalised models.
arXiv Detail & Related papers (2022-11-07T11:12:57Z) - FL Games: A Federated Learning Framework for Distribution Shifts [71.98708418753786]
Federated learning aims to train predictive models for data that is distributed across clients, under the orchestration of a server.
We propose FL GAMES, a game-theoretic framework for federated learning that learns causal features that are invariant across clients.
arXiv Detail & Related papers (2022-10-31T22:59:03Z) - FedMR: Fedreated Learning via Model Recombination [7.404225808071622]
Federated Learning (FL) enables global model training across clients without compromising their confidential local data.
Existing FL methods rely on Federated Averaging (FedAvg)-based aggregation.
This paper proposes a novel and effective FL paradigm named FedMR (Federated Model Recombination).
arXiv Detail & Related papers (2022-08-16T11:30:19Z) - Fine-tuning Global Model via Data-Free Knowledge Distillation for
Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose a data-free knowledge distillation method to fine-tune the global model on the server (FedFTG).
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
arXiv Detail & Related papers (2022-03-17T11:18:17Z)