FedTGP: Trainable Global Prototypes with Adaptive-Margin-Enhanced
Contrastive Learning for Data and Model Heterogeneity in Federated Learning
- URL: http://arxiv.org/abs/2401.03230v1
- Date: Sat, 6 Jan 2024 14:43:47 GMT
- Authors: Jianqing Zhang, Yang Liu, Yang Hua, and Jian Cao
- Abstract summary: Heterogeneous Federated Learning (HtFL) has attracted attention due to its ability to support heterogeneous models and data.
We introduce a novel HtFL approach called FedTGP, which leverages our Adaptive-margin-enhanced Contrastive Learning (ACL) to learn Trainable Global Prototypes (TGP) on the server.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Recently, Heterogeneous Federated Learning (HtFL) has attracted attention due
to its ability to support heterogeneous models and data. To reduce the high
communication cost of transmitting model parameters, a major challenge in HtFL,
prototype-based HtFL methods are proposed to solely share class
representatives, a.k.a. prototypes, among heterogeneous clients while
maintaining the privacy of clients' models. However, these prototypes are
naively aggregated into global prototypes on the server using weighted
averaging, resulting in suboptimal global knowledge which negatively impacts
the performance of clients. To overcome this challenge, we introduce a novel
HtFL approach called FedTGP, which leverages our Adaptive-margin-enhanced
Contrastive Learning (ACL) to learn Trainable Global Prototypes (TGP) on the
server. By incorporating ACL, our approach enhances prototype separability
while preserving semantic meaning. Extensive experiments with twelve
heterogeneous models demonstrate that our FedTGP surpasses state-of-the-art
methods by up to 9.08% in accuracy while maintaining the communication and
privacy advantages of prototype-based HtFL. Our code is available at
https://github.com/TsingZ0/FedTGP.
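To make the contrast in the abstract concrete, the sketch below shows both server-side strategies in PyTorch: the naive weighted-average prototype aggregation that FedTGP replaces, and trainable global prototypes optimized with a margin-enhanced contrastive loss. This is a minimal sketch under assumptions: the additive-margin logit form, the adaptive rule (the margin tracks the smallest inter-prototype distance, capped by a hyperparameter `margin_max`), and all names here are illustrative rather than the paper's exact ACL formulation; see the repository above for the authors' implementation.
```python
# Illustrative sketch only: weighted-average baseline vs. trainable global
# prototypes with a margin-enhanced contrastive loss. The adaptive-margin
# rule and all names are assumptions for exposition, not the exact ACL.
import torch
import torch.nn.functional as F


def weighted_average_prototypes(client_protos, client_counts):
    """Baseline: aggregate per-class client prototypes by weighted averaging.

    client_protos: {client_id: {class_id: (D,) tensor}}
    client_counts: {client_id: {class_id: sample count}}
    Returns {class_id: (D,) global prototype}.
    """
    sums, totals = {}, {}
    for cid, protos in client_protos.items():
        for c, p in protos.items():
            n = client_counts[cid][c]
            sums[c] = sums.get(c, 0) + n * p
            totals[c] = totals.get(c, 0) + n
    return {c: sums[c] / totals[c] for c in sums}


class TrainableGlobalPrototypes(torch.nn.Module):
    """Server-side global prototypes kept as trainable embeddings."""

    def __init__(self, num_classes, dim):
        super().__init__()
        self.protos = torch.nn.Parameter(torch.randn(num_classes, dim))


def margin_contrastive_loss(tgp, client_protos, labels, margin):
    """Pull each uploaded client prototype toward its class's global
    prototype and push the other classes at least `margin` away, via an
    additive margin on the target-class logit (negative distance)."""
    logits = -torch.cdist(client_protos, tgp.protos)  # (B, K)
    logits = logits - margin * F.one_hot(labels, tgp.protos.size(0)).float()
    return F.cross_entropy(logits, labels)


# Usage: one round of server-side prototype training on uploaded prototypes.
K, D = 10, 64
tgp = TrainableGlobalPrototypes(K, D)
opt = torch.optim.Adam(tgp.parameters(), lr=1e-3)
uploaded = torch.randn(128, D)             # stand-in client prototypes
labels = torch.randint(0, K, (128,))
margin_max = 100.0                         # assumed cap hyperparameter

for _ in range(100):
    with torch.no_grad():
        # Assumed adaptive rule: the margin tracks the smallest inter-class
        # prototype distance, capped so classes keep separating early on.
        d = torch.cdist(tgp.protos, tgp.protos)
        d = d + torch.eye(K) * d.max()     # mask out self-distances
        margin = min(d.min().item(), margin_max)
    opt.zero_grad()
    margin_contrastive_loss(tgp, uploaded, labels, margin).backward()
    opt.step()
```
In this sketch the margin grows as the learned prototypes spread apart, so the contrastive objective keeps enlarging inter-class separation without discarding the semantics carried by the uploaded client prototypes.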
Related papers
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on the effectiveness of utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- Tunable Soft Prompts are Messengers in Federated Learning [55.924749085481544]
Federated learning (FL) enables multiple participants to collaboratively train machine learning models using decentralized data sources.
The lack of model privacy protection in FL has become a non-negligible challenge.
We propose a novel FL training approach that accomplishes information exchange among participants via tunable soft prompts.
arXiv Detail & Related papers (2023-11-12T11:01:10Z)
- Unlocking the Potential of Prompt-Tuning in Bridging Generalized and Personalized Federated Learning [49.72857433721424]
Vision Transformers (ViT) and Visual Prompt Tuning (VPT) achieve state-of-the-art performance with improved efficiency in various computer vision tasks.
We present a novel algorithm, SGPT, that integrates Generalized FL (GFL) and Personalized FL (PFL) approaches by employing a unique combination of both shared and group-specific prompts.
arXiv Detail & Related papers (2023-10-27T17:22:09Z)
- Inclusive Data Representation in Federated Learning: A Novel Approach Integrating Textual and Visual Prompt [12.869146009608816]
We present Twin Prompt Federated Learning (TPFL), a pioneering solution that integrates both visual and textual modalities.
In order to tackle the data heterogeneity issues, we introduce the Augmented TPFL (ATPFL), which not only enhances the global knowledge acquisition of client models but also fosters the development of robust, compact models.
The effectiveness of TPFL and ATPFL is substantiated by our extensive evaluations, consistently showing superior performance compared to all baselines.
arXiv Detail & Related papers (2023-10-04T11:20:28Z)
- PFL-GAN: When Client Heterogeneity Meets Generative Models in Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation scheme.
Empirical results from rigorous experimentation on several well-known datasets demonstrate the effectiveness of PFL-GAN.
arXiv Detail & Related papers (2023-08-23T22:38:35Z)
- FedPerfix: Towards Partial Model Personalization of Vision Transformers in Federated Learning [9.950367271170592]
We investigate where and how to partially personalize a Vision Transformer (ViT) model.
Based on the insights that the self-attention layer and the classification head are the most sensitive parts of a ViT, we propose a novel approach called FedPerfix.
We evaluate the proposed approach on CIFAR-100, OrganAMNIST, and Office-Home datasets and demonstrate its effectiveness compared to several advanced PFL methods.
arXiv Detail & Related papers (2023-08-17T19:22:30Z)
- Towards Instance-adaptive Inference for Federated Learning [80.38701896056828]
Federated learning (FL) is a distributed learning paradigm that enables multiple clients to learn a powerful global model by aggregating locally trained models.
In this paper, we present a novel FL algorithm, i.e., FedIns, to handle intra-client data heterogeneity by enabling instance-adaptive inference in the FL framework.
Our experiments show that our FedIns outperforms state-of-the-art FL algorithms, e.g., a 6.64% improvement against the top-performing method with less than 15% communication cost on Tiny-ImageNet.
arXiv Detail & Related papers (2023-08-11T09:58:47Z)
- Tackling Data Heterogeneity in Federated Learning with Class Prototypes [44.746340839025194]
We propose FedNH, a novel method that improves the local models' performance for both personalization and generalization.
We show that imposing uniformity helps to combat prototype collapse, while infusing class semantics improves local models (a minimal sketch of the uniformity idea appears after this list).
arXiv Detail & Related papers (2022-12-06T05:15:38Z)
- Closing the Gap between Client and Global Model Performance in Heterogeneous Federated Learning [2.1044900734651626]
We show how the chosen approach for training custom client models has an impact on the global model.
We propose a new approach that combines KD and Learning without Forgetting (LwoF) to produce improved personalised models.
arXiv Detail & Related papers (2022-11-07T11:12:57Z)
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose FedFTG, a data-free knowledge distillation method that fine-tunes the global model on the server (see the illustrative sketch after this list).
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
arXiv Detail & Related papers (2022-03-17T11:18:17Z)
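The FedNH entry above mentions combating prototype collapse through uniformity. As a minimal sketch of that idea, the snippet below spreads fixed class prototypes (near-)uniformly over the unit hypersphere by minimizing pairwise cosine similarity; this optimization-based stand-in is an assumption for illustration, not FedNH's exact initialization.
```python
# Sketch of the uniformity idea: place fixed class prototypes near-uniformly
# on the unit hypersphere so they cannot collapse onto each other.
import torch

def uniform_hypersphere_prototypes(num_classes, dim, steps=1000, lr=0.1):
    # Maximize pairwise separation by minimizing total cosine similarity,
    # a simple stand-in for FedNH's uniform prototype initialization.
    p = torch.randn(num_classes, dim, requires_grad=True)
    opt = torch.optim.SGD([p], lr=lr)
    for _ in range(steps):
        q = torch.nn.functional.normalize(p, dim=1)
        loss = (q @ q.t()).triu(1).sum()  # off-diagonal cosine similarities
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.nn.functional.normalize(p.detach(), dim=1)
```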
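For the data-free knowledge distillation in the FedFTG entry, here is a hedged sketch of alternating generator/student updates. The simple averaged teacher ensemble, the loss shapes, and names such as `generator(z, y)` are assumptions for exposition; the paper's hard-sample mining and label-sampling strategies are not reproduced here.
```python
# Hedged sketch of data-free knowledge distillation for server-side
# fine-tuning, in the spirit of FedFTG. The averaged teacher ensemble and
# the conditional generator interface are illustrative assumptions.
import torch
import torch.nn.functional as F

def data_free_finetune(generator, global_model, client_models, num_classes,
                       steps=100, batch=64, noise_dim=100, lr=1e-3):
    for m in client_models:            # client models act as frozen teachers
        m.requires_grad_(False)
    g_opt = torch.optim.Adam(generator.parameters(), lr=lr)
    s_opt = torch.optim.Adam(global_model.parameters(), lr=lr)
    for _ in range(steps):
        z = torch.randn(batch, noise_dim)
        y = torch.randint(0, num_classes, (batch,))

        # (1) Generator step: synthesize pseudo-samples that the teachers
        # label as y but on which the student still disagrees with them.
        x = generator(z, y)
        t = torch.stack([m(x) for m in client_models]).mean(0)
        s = global_model(x)
        g_loss = F.cross_entropy(t, y) - F.kl_div(
            F.log_softmax(s, 1), F.softmax(t, 1), reduction="batchmean")
        g_opt.zero_grad()
        g_loss.backward()
        g_opt.step()

        # (2) Student step: distill the teacher ensemble on fresh samples.
        with torch.no_grad():
            x = generator(z, y)
            t = torch.stack([m(x) for m in client_models]).mean(0)
        s = global_model(x)
        s_loss = F.kl_div(F.log_softmax(s, 1), F.softmax(t, 1),
                          reduction="batchmean")
        s_opt.zero_grad()
        s_loss.backward()
        s_opt.step()
```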