Inclusive Data Representation in Federated Learning: A Novel Approach Integrating Textual and Visual Prompt
- URL: http://arxiv.org/abs/2310.04455v1
- Date: Wed, 4 Oct 2023 11:20:28 GMT
- Title: Inclusive Data Representation in Federated Learning: A Novel Approach Integrating Textual and Visual Prompt
- Authors: Zihao Zhao, Zhenpeng Shi, Yang Liu, Wenbo Ding
- Abstract summary: We present Twin Prompt Federated learning (TPFL), a pioneering solution that integrates both visual and textual modalities.
To tackle data heterogeneity, we introduce Augmented TPFL (ATPFL), which not only enhances the global knowledge acquisition of client models but also fosters the development of robust, compact models.
The effectiveness of TPFL and ATPFL is substantiated by our extensive evaluations, consistently showing superior performance compared to all baselines.
- Score: 12.869146009608816
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) is often impeded by communication overhead issues.
Prompt tuning, as a potential solution, has been introduced to only adjust a
few trainable parameters rather than the whole model. However, current
single-modality prompt tuning approaches fail to comprehensively portray local
clients' data. To overcome this limitation, we present Twin Prompt Federated
learning (TPFL), a pioneering solution that integrates both visual and textual
modalities, ensuring a more holistic representation of local clients' data
characteristics. Furthermore, to tackle data heterogeneity, we introduce
Augmented TPFL (ATPFL), which applies contrastive learning to TPFL, not only
enhancing the global knowledge acquisition of client models but also fostering
the development of robust, compact models. The effectiveness
of TPFL and ATPFL is substantiated by our extensive evaluations, consistently
showing superior performance compared to all baselines.
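To make the mechanism concrete, below is a minimal, self-contained PyTorch sketch of twin-prompt tuning in an FL client. It is not the authors' implementation: the linear "encoders" stand in for a frozen pretrained backbone (e.g., CLIP's text and image towers), and all names and dimensions are illustrative. The property it demonstrates is the one the abstract relies on: only the two small prompt tensors are trained and exchanged with the server.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwinPromptClient(nn.Module):
    """Illustrative twin-prompt client: frozen encoders, trainable prompts."""

    def __init__(self, embed_dim=64, prompt_len=4, num_classes=10):
        super().__init__()
        # The only trainable (and communicated) parameters.
        self.text_prompt = nn.Parameter(0.02 * torch.randn(prompt_len, embed_dim))
        self.visual_prompt = nn.Parameter(0.02 * torch.randn(prompt_len, embed_dim))
        # Frozen stand-ins for pretrained encoders and class-name embeddings.
        self.text_encoder = nn.Linear(embed_dim, embed_dim)
        self.image_encoder = nn.Linear(embed_dim, embed_dim)
        self.class_text = nn.Parameter(torch.randn(num_classes, embed_dim),
                                       requires_grad=False)
        for enc in (self.text_encoder, self.image_encoder):
            for p in enc.parameters():
                p.requires_grad = False

    def forward(self, image_tokens):  # image_tokens: (batch, seq, embed_dim)
        # Prepend the visual prompt to the image token sequence, then pool.
        vp = self.visual_prompt.unsqueeze(0).expand(image_tokens.size(0), -1, -1)
        img = torch.cat([vp, image_tokens], dim=1).mean(dim=1)
        img_feat = F.normalize(self.image_encoder(img), dim=-1)
        # Condition the class-name embeddings on the (pooled) text prompt.
        txt = self.class_text + self.text_prompt.mean(dim=0)
        txt_feat = F.normalize(self.text_encoder(txt), dim=-1)
        return img_feat @ txt_feat.t()  # cosine-similarity logits

def fedavg_prompts(prompt_tensors):
    """Server-side FedAvg restricted to the prompt tensors."""
    return torch.stack(prompt_tensors).mean(dim=0)
```

ATPFL's contrastive term would sit on top of features like `img_feat` and `txt_feat`, e.g. pulling a client's representations toward globally aggregated prompts; the paper's exact loss is not reproduced here.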
Related papers
- Beyond the Federation: Topology-aware Federated Learning for Generalization to Unseen Clients [10.397502254316645] (2024-07-06)
Federated learning is widely employed to handle distributed, sensitive data.
Topology-aware Federated Learning (TFL) trains robust models against out-of-federation (OOF) data.
We formulate a novel optimization problem for TFL, consisting of two key modules: Client Topology Learning and Learning on Client Topology.
Empirical evaluation on a variety of real-world datasets verifies TFL's superior OOF robustness and scalability.
- SpaFL: Communication-Efficient Federated Learning with Sparse Models and Low Computational Overhead [75.87007729801304] (2024-06-01)
SpaFL, a communication-efficient FL framework, is proposed to optimize sparse model structures with low computational overhead.
Experiments show that SpaFL improves accuracy while requiring much less communication and computing resources compared to sparse baselines.
- FedMAP: Unlocking Potential in Personalized Federated Learning through Bi-Level MAP Optimization [11.040916982022978] (2024-05-29)
Federated Learning (FL) enables collaborative training of machine learning models on decentralized data.
Data across clients often differs significantly due to class imbalance, feature distribution skew, sample size imbalance, and other phenomena.
We propose a novel Bayesian PFL framework using bi-level optimization to tackle the data heterogeneity challenges.
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037] (2024-04-29)
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
- FedTGP: Trainable Global Prototypes with Adaptive-Margin-Enhanced Contrastive Learning for Data and Model Heterogeneity in Federated Learning [18.916282151435727] (2024-01-06)
Heterogeneous Federated Learning (HtFL) has attracted attention due to its ability to support heterogeneous models and data.
We introduce a novel HtFL approach called FedTGP, which leverages our Adaptive-margin-enhanced Contrastive Learning (ACL) to learn Trainable Global Prototypes (TGP) on the server.
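As a rough illustration of what a margin-enhanced prototype contrastive objective can look like, consider the hedged sketch below. It is not FedTGP's exact ACL loss; `margin` and `temp` are illustrative hyperparameters, and the paper's adaptive margin schedule is replaced by a fixed value.

```python
import torch
import torch.nn.functional as F

def margin_prototype_loss(features, labels, prototypes, margin=0.5, temp=0.07):
    """Pull each feature toward its class prototype and away from the rest;
    subtracting the margin from the positive similarity forces prototypes to
    stay separated by at least that margin."""
    feats = F.normalize(features, dim=-1)
    protos = F.normalize(prototypes, dim=-1)
    sims = feats @ protos.t()                                  # (batch, classes)
    sims = sims - margin * F.one_hot(labels, protos.size(0))   # margin on positives
    return F.cross_entropy(sims / temp, labels)
```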
- Tunable Soft Prompts are Messengers in Federated Learning [55.924749085481544] (2023-11-12)
Federated learning (FL) enables multiple participants to collaboratively train machine learning models using decentralized data sources.
The lack of model privacy protection in FL has become a non-negligible challenge.
We propose a novel FL training approach that accomplishes information exchange among participants via tunable soft prompts.
- Unlocking the Potential of Prompt-Tuning in Bridging Generalized and Personalized Federated Learning [49.72857433721424] (2023-10-27)
Vision Transformers (ViT) and Visual Prompt Tuning (VPT) achieve state-of-the-art performance with improved efficiency in various computer vision tasks.
We present a novel algorithm, SGPT, that integrates Generalized FL (GFL) and Personalized FL (PFL) approaches by employing a unique combination of both shared and group-specific prompts.
- FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning [87.08902493524556] (2022-07-20)
Federated learning (FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
In detail, we construct synthetic sets of data on each client to locally match the loss landscape from original data.
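One common way to realize this kind of synthetic-proxy objective is embedding-distribution matching; the sketch below is an assumption-laden simplification, not FedDM's exact iterative loss-landscape matching. `embed` can be any feature extractor, and the shapes in the usage comment are illustrative.

```python
import torch

def synth_matching_step(synth, real_batch, embed, opt):
    """One client-side update of a trainable synthetic set: pull the mean
    embedding of the synthetic data toward that of a real batch, so training
    can later proceed on the compact synthetic proxy instead of raw client
    data. `synth` is a tensor created with requires_grad=True."""
    opt.zero_grad()
    gap = embed(synth).mean(dim=0) - embed(real_batch).mean(dim=0).detach()
    loss = gap.pow(2).sum()
    loss.backward()
    opt.step()
    return loss.item()

# Usage sketch: synth = torch.randn(10, 3, 32, 32, requires_grad=True)
#               opt = torch.optim.SGD([synth], lr=0.1)
```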
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456] (2022-03-17)
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose FedFTG, a data-free knowledge distillation method that fine-tunes the global model on the server.
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
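A minimal sketch of the data-free distillation step this describes follows. The generator, batch size, and shapes are placeholder assumptions, and FedFTG's adversarial training of the generator is omitted.

```python
import torch
import torch.nn.functional as F

def datafree_kd_step(generator, global_model, client_models, opt, z_dim=64):
    """Fine-tune the global model on generator-made pseudo-data so that its
    predictions match the averaged client-model predictions (the teacher)."""
    z = torch.randn(32, z_dim)
    pseudo = generator(z)  # synthetic inputs; no real client data needed
    with torch.no_grad():
        teacher = torch.stack([m(pseudo) for m in client_models]).mean(dim=0)
    student_logp = F.log_softmax(global_model(pseudo), dim=-1)
    loss = F.kl_div(student_logp, F.softmax(teacher, dim=-1),
                    reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```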
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and accepts no responsibility for any consequences of its use.