Parametric Feature Transfer: One-shot Federated Learning with Foundation
Models
- URL: http://arxiv.org/abs/2402.01862v1
- Date: Fri, 2 Feb 2024 19:34:46 GMT
- Title: Parametric Feature Transfer: One-shot Federated Learning with Foundation
Models
- Authors: Mahdi Beitollahi, Alex Bie, Sobhan Hemati, Leo Maxime Brunswic, Xu Li,
Xi Chen, Guojun Zhang
- Abstract summary: In one-shot federated learning, clients collaboratively train a global model in a single round of communication.
This paper introduces FedPFT, a methodology that harnesses the transferability of foundation models to enhance both accuracy and communication efficiency in one-shot FL.
- Score: 14.97955440815159
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In one-shot federated learning (FL), clients collaboratively train a global
model in a single round of communication. Existing approaches for one-shot FL
enhance communication efficiency at the expense of diminished accuracy. This
paper introduces FedPFT (Federated Learning with Parametric Feature Transfer),
a methodology that harnesses the transferability of foundation models to
enhance both accuracy and communication efficiency in one-shot FL. The approach
involves transferring per-client parametric models (specifically, Gaussian
mixtures) of features extracted from foundation models. Subsequently, each
parametric model is employed to generate synthetic features for training a
classifier head. Experimental results on eight datasets demonstrate that FedPFT
enhances the communication-accuracy frontier in both centralized and
decentralized FL scenarios, as well as across diverse data-heterogeneity
settings such as covariate shift and task shift, with improvements of up to
20.6%. Additionally, FedPFT adheres to the data minimization principle of FL,
as clients do not send real features. We demonstrate that sending real features
is vulnerable to potent reconstruction attacks. Moreover, we show that FedPFT
is amenable to formal privacy guarantees via differential privacy,
demonstrating favourable privacy-accuracy tradeoffs.
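The recipe above is simple enough to sketch end to end. Below is a minimal illustration of the idea rather than the paper's implementation: random NumPy arrays stand in for foundation-model features, a per-class Gaussian mixture is fit on each client, only the mixture parameters are shared, and the server trains a linear head on synthetic samples. The helper names, component count, and covariance type are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LogisticRegression

def fit_client_gmms(features, labels, n_components=3, seed=0):
    """Fit one GMM per class on a client's extracted features (runs on-client)."""
    gmms = {}
    for c in np.unique(labels):
        gmm = GaussianMixture(n_components=n_components,
                              covariance_type="diag", random_state=seed)
        gmm.fit(features[labels == c])
        gmms[int(c)] = gmm  # only these parametric models leave the client
    return gmms

def train_server_head(per_client_gmms, samples_per_class=500):
    """Sample synthetic features from every client's GMMs and fit a linear head."""
    xs, ys = [], []
    for gmms in per_client_gmms:
        for c, gmm in gmms.items():
            synth, _ = gmm.sample(samples_per_class)
            xs.append(synth)
            ys.append(np.full(len(synth), c))
    return LogisticRegression(max_iter=1000).fit(np.vstack(xs), np.concatenate(ys))

# Two clients with disjoint classes; the random arrays stand in for features
# extracted by a frozen foundation model.
rng = np.random.default_rng(0)
clients = [(rng.normal(c, 1.0, (200, 16)), np.full(200, c)) for c in (0, 1)]
head = train_server_head([fit_client_gmms(f, l) for f, l in clients])
```

Because only mixture parameters (means, variances, weights) cross the network, the payload is independent of dataset size, which is where the communication savings come from.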
Related papers
- Tackling Feature-Classifier Mismatch in Federated Learning via Prompt-Driven Feature Transformation [12.19025665853089]
In traditional Federated Learning approaches, the global model underperforms when faced with data heterogeneity.
We propose a new PFL framework called FedPFT to address the mismatch problem while enhancing the quality of the feature extractor.
Our experiments demonstrate that FedPFT outperforms state-of-the-art methods by up to 7.08%.
arXiv Detail & Related papers (2024-07-23T02:52:52Z)
- SpaFL: Communication-Efficient Federated Learning with Sparse Models and Low Computational Overhead [75.87007729801304]
SpaFL, a communication-efficient FL framework, is proposed to optimize sparse model structures with low computational overhead.
Experiments show that SpaFL improves accuracy while requiring far less communication and computing resources than sparse baselines.
arXiv Detail & Related papers (2024-06-01T13:10:35Z)
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on the effectiveness of utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- FLASH: Federated Learning Across Simultaneous Heterogeneities [54.80435317208111]
FLASH (Federated Learning Across Simultaneous Heterogeneities) is a lightweight and flexible client selection algorithm.
It outperforms state-of-the-art FL frameworks under extensive sources of heterogeneities, achieving substantial and consistent improvements over state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-13T20:04:39Z)
- FedSSA: Semantic Similarity-based Aggregation for Efficient Model-Heterogeneous Personalized Federated Learning [40.827571502726805]
Federated learning (FL) is a privacy-preserving collaborative machine learning paradigm.
Model-Heterogeneous Personalized FL (MHPFL) has emerged to address the challenge of model heterogeneity across clients.
Existing MHPFL approaches often rely on a public dataset with the same nature as the learning task, or incur high computation and communication costs.
We propose the Federated Semantic Similarity Aggregation (FedSSA) approach for supervised classification tasks.
FedSSA achieves up to 3.62% higher accuracy, 15.54 times higher communication efficiency, and 15.52 times higher computational efficiency compared to 7 state-of-the-art MHPFL baselines.
arXiv Detail & Related papers (2023-12-14T14:55:32Z)
- FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning [37.96957782129352]
We propose a finetuning framework tailored to heterogeneous multi-modal foundation models, called Federated Dual-Adapter Teacher (FedDAT).
FedDAT addresses data heterogeneity by regularizing the client local updates and applying Mutual Knowledge Distillation (MKD) for efficient knowledge transfer.
To demonstrate its effectiveness, we conduct extensive experiments on four multi-modality FL benchmarks with different types of data heterogeneity.
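A rough sketch of the MKD ingredient may help. This is a generic mutual-distillation loss under assumed temperature and weighting, not FedDAT's exact dual-adapter formulation: each of two models takes its task loss plus a temperature-scaled KL term pulling it toward the other's detached predictions.

```python
import torch
import torch.nn.functional as F

def mkd_loss(logits_a, logits_b, labels, temperature=2.0, alpha=0.5):
    """Mutual knowledge distillation: task loss plus bidirectional soft-label KL."""
    t = temperature
    ce = F.cross_entropy(logits_a, labels) + F.cross_entropy(logits_b, labels)
    # Each model distills from the other's (detached) softened predictions.
    kl_a = F.kl_div(F.log_softmax(logits_a / t, dim=1),
                    F.softmax(logits_b.detach() / t, dim=1),
                    reduction="batchmean") * t * t
    kl_b = F.kl_div(F.log_softmax(logits_b / t, dim=1),
                    F.softmax(logits_a.detach() / t, dim=1),
                    reduction="batchmean") * t * t
    return ce + alpha * (kl_a + kl_b)
```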
arXiv Detail & Related papers (2023-08-21T21:57:01Z)
- Personalized Federated Learning under Mixture of Distributions [98.25444470990107]
We propose a novel approach to Personalized Federated Learning (PFL), which utilizes Gaussian mixture models (GMM) to fit the input data distributions across diverse clients.
FedGMM possesses an additional advantage of adapting to new clients with minimal overhead, and it also enables uncertainty quantification.
Empirical evaluations on synthetic and benchmark datasets demonstrate the superior performance of our method in both PFL classification and novel sample detection.
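As a hint of how novel-sample detection can fall out of a fitted mixture (a sketch with an assumed percentile-threshold rule and hyperparameters, not FedGMM's actual procedure), inputs with low log-likelihood under the client's GMM can be flagged as out-of-distribution:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def novelty_flags(train_x, test_x, n_components=5, percentile=1.0, seed=0):
    """Flag test points whose GMM log-likelihood falls below a training-set percentile."""
    gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(train_x)
    threshold = np.percentile(gmm.score_samples(train_x), percentile)
    return gmm.score_samples(test_x) < threshold  # True = likely novel sample
```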
arXiv Detail & Related papers (2023-05-01T20:04:46Z)
- FedGH: Heterogeneous Federated Learning with Generalized Global Header [16.26231633749833]
Federated learning (FL) is an emerging machine learning paradigm that allows multiple parties to train a shared model.
We propose a simple but effective Federated Global prediction Header (FedGH) approach.
FedGH trains a shared generalized global prediction header on representations extracted by the heterogeneous feature extractors of clients' models.
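In sketch form, the server-side step might look as follows, with a plain linear layer standing in for the generalized header and hypothetical (representation, label) uploads; FedGH's actual protocol aggregates per-class representations:

```python
import torch
import torch.nn as nn

def train_global_header(uploads, rep_dim, n_classes, epochs=5, lr=1e-2):
    """Fit one shared header on (representation, label) batches uploaded by clients."""
    header = nn.Linear(rep_dim, n_classes)
    opt = torch.optim.SGD(header.parameters(), lr=lr)
    for _ in range(epochs):
        for reps, labels in uploads:  # reps come from heterogeneous client extractors
            opt.zero_grad()
            nn.functional.cross_entropy(header(reps), labels).backward()
            opt.step()
    return header  # sent back to replace each client's local prediction header
```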
arXiv Detail & Related papers (2023-03-23T09:38:52Z)
- FedCliP: Federated Learning with Client Pruning [3.796320380104124]
Federated learning (FL) is a newly emerging distributed learning paradigm.
One fundamental bottleneck in FL is the heavy communication overheads between the distributed clients and the central server.
We propose FedCliP, the first communication-efficient FL training framework designed from a macro perspective.
arXiv Detail & Related papers (2023-01-17T09:15:37Z)
- FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning [87.08902493524556]
Federated learning (FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
In detail, we construct synthetic sets of data on each client to locally match the loss landscape from original data.
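One common recipe for such loss-landscape matching is gradient matching, sketched below under assumed sizes and step counts; FedDM's exact objective differs, but the shape of the computation is similar: the client learns a small synthetic set whose gradients mimic those of its real data, and uploads only that set.

```python
import torch
import torch.nn.functional as F

def distill_client_set(model, real_x, real_y, n_synth=32, steps=200, lr=0.1):
    """Learn a small synthetic set whose gradients mimic those of the real local data."""
    synth_x = torch.randn(n_synth, real_x.shape[1], requires_grad=True)
    synth_y = real_y[torch.randint(len(real_y), (n_synth,))]  # reuse real label values
    # Reference gradients on the real data (model parameters stay fixed).
    real_grads = torch.autograd.grad(
        F.cross_entropy(model(real_x), real_y), model.parameters())
    opt = torch.optim.Adam([synth_x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        synth_grads = torch.autograd.grad(
            F.cross_entropy(model(synth_x), synth_y), model.parameters(),
            create_graph=True)
        # Penalize the gap between synthetic-data and real-data gradients.
        gap = sum(((gs - gr) ** 2).sum() for gs, gr in zip(synth_grads, real_grads))
        gap.backward()
        opt.step()
    return synth_x.detach(), synth_y  # the only payload sent to the server
```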
arXiv Detail & Related papers (2022-07-20T04:55:18Z)
- Toward Understanding the Influence of Individual Clients in Federated Learning [52.07734799278535]
Federated learning allows clients to jointly train a global model without sending their private data to a central server.
We define a new notion called "Influence" to quantify each client's influence on the model parameters, and propose an effective and efficient method to estimate this metric.
arXiv Detail & Related papers (2020-12-20T14:34:36Z)