FedD2S: Personalized Data-Free Federated Knowledge Distillation
- URL: http://arxiv.org/abs/2402.10846v1
- Date: Fri, 16 Feb 2024 17:36:51 GMT
- Title: FedD2S: Personalized Data-Free Federated Knowledge Distillation
- Authors: Kawa Atapour, S. Jamal Seyedmohammadi, Jamshid Abouei, Arash
Mohammadi, Konstantinos N. Plataniotis
- Abstract summary: We propose a novel approach named FedD2S for Personalized Federated Learning (pFL), leveraging knowledge distillation.
FedD2S incorporates a deep-to-shallow layer-dropping mechanism in the data-free knowledge distillation process to enhance local model personalization.
The proposed approach demonstrates superior performance, characterized by accelerated convergence and improved fairness among clients.
- Score: 19.975420988169454
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper addresses the challenge of mitigating data heterogeneity among
clients within a Federated Learning (FL) framework. The model-drift issue,
arising from the non-IID nature of client data, often results in suboptimal
personalization of a global model compared to locally trained models for each
client. To tackle this challenge, we propose a novel approach named FedD2S for
Personalized Federated Learning (pFL), leveraging knowledge distillation.
FedD2S incorporates a deep-to-shallow layer-dropping mechanism in the data-free
knowledge distillation process to enhance local model personalization. Through
extensive simulations on diverse image datasets (FEMNIST, CIFAR10, CINIC10, and
CIFAR100), we compare FedD2S with state-of-the-art FL baselines. The proposed
approach demonstrates superior performance, characterized by accelerated
convergence and improved fairness among clients. The introduced layer-dropping
technique effectively captures personalized knowledge, resulting in enhanced
performance compared to alternative FL models. Moreover, we investigate the
impact of key hyperparameters, such as the participation ratio and
layer-dropping rate, providing valuable insights into the optimal configuration
for FedD2S. The findings demonstrate the efficacy of adaptive layer-dropping in
the knowledge distillation process to achieve enhanced personalization and
performance across diverse datasets and tasks.
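The abstract describes the layer-dropping mechanism only at a high level. As a rough illustration of the deep-to-shallow idea, the sketch below distills received knowledge into a client model while progressively excluding the deepest layers from the distillation loss, so those layers stay free to personalize. This is a reconstruction under assumptions, not the authors' code: the block structure, the feature-matching loss, and the drop schedule are all invented for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical client model split into blocks so that individual
# depths can be included in, or dropped from, distillation.
class BlockNet(nn.Module):
    def __init__(self, in_dim=784, dims=(32, 64, 128, 10)):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.Linear(d_in, d_out)
            for d_in, d_out in zip((in_dim,) + dims[:-1], dims)
        )

    def forward(self, x):
        feats = []
        for i, block in enumerate(self.blocks):
            x = block(x)
            if i < len(self.blocks) - 1:
                x = F.relu(x)
            feats.append(x)
        return feats  # per-block features; feats[-1] are the logits

def d2s_distill_loss(student_feats, teacher_feats, n_dropped):
    """Feature-matching distillation that ignores the n_dropped
    deepest blocks (the deep-to-shallow layer dropping)."""
    kept = len(student_feats) - n_dropped
    return sum(F.mse_loss(s, t.detach())
               for s, t in zip(student_feats[:kept], teacher_feats[:kept]))

# Assumed schedule: drop one more deep layer every few rounds, so the
# deepest layers personalize while shallow ones keep absorbing
# shared knowledge.
def layers_to_drop(round_idx, drop_every=5, max_drop=3):
    return min(round_idx // drop_every, max_drop)
```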
Related papers
- Adversarial Federated Consensus Learning for Surface Defect Classification Under Data Heterogeneity in IIoT [8.48069043458347]
It is difficult to collect and centralize sufficient training data from the various entities in the Industrial Internet of Things (IIoT).
Federated learning (FL) provides a solution by enabling collaborative global model training across clients.
We propose a novel personalized FL approach named Adversarial Federated Consensus Learning (AFedCL).
arXiv Detail & Related papers (2024-09-24T03:59:32Z)
- FedMAP: Unlocking Potential in Personalized Federated Learning through Bi-Level MAP Optimization [11.040916982022978]
Federated Learning (FL) enables collaborative training of machine learning models on decentralized data.
Data across clients often differs significantly due to class imbalance, feature distribution skew, sample size imbalance, and other phenomena.
We propose a novel Bayesian PFL framework using bi-level optimization to tackle the data heterogeneity challenges.
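The summary does not spell out the bi-level structure. A generic sketch of a MAP-regularized inner step, assuming a Gaussian prior centered on the global model (the penalty form and the aggregation rule are assumptions, not necessarily FedMAP's actual formulation):

```python
import torch

def local_map_update(model, loader, global_params, sigma2=1.0, lr=0.01):
    """Inner level: personalize by maximizing the local posterior, i.e.
    minimize local loss plus a Gaussian-prior penalty pulling the
    parameters toward the global model (an assumed prior)."""
    loss_fn = torch.nn.CrossEntropyLoss()
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for x, y in loader:
        opt.zero_grad()
        nll = loss_fn(model(x), y)
        prior = sum(((p - g.detach()) ** 2).sum()
                    for p, g in zip(model.parameters(), global_params))
        (nll + prior / (2 * sigma2)).backward()
        opt.step()
    # Outer level (not shown): the server re-estimates the prior
    # from the clients' personalized solutions.
    return [p.detach().clone() for p in model.parameters()]
```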
arXiv Detail & Related papers (2024-05-29T11:28:06Z)
- Stable Diffusion-based Data Augmentation for Federated Learning with Non-IID Data [9.045647166114916]
Federated Learning (FL) is a promising paradigm for decentralized and collaborative model training.
FL struggles with a significant performance reduction and poor convergence when confronted with Non-Independent and Identically Distributed (Non-IID) data distributions.
We introduce Gen-FedSD, a novel approach that harnesses the powerful capability of state-of-the-art text-to-image foundation models.
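The summary leaves out how the generated images are targeted. One plausible reading, sketched with the Hugging Face diffusers library (the prompt template, the per-class quota, and the gap-filling rule are assumptions), is to synthesize examples of the classes each client lacks:

```python
from collections import Counter
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

def augment_client(labels, class_names, target_per_class=100):
    """Generate synthetic images for the classes a client is short on,
    padding its local dataset toward a more balanced distribution."""
    counts = Counter(labels)
    synthetic = []
    for cls_id, name in enumerate(class_names):
        deficit = target_per_class - counts.get(cls_id, 0)
        for _ in range(max(deficit, 0)):
            image = pipe(f"a photo of a {name}").images[0]
            synthetic.append((image, cls_id))
    return synthetic
```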
arXiv Detail & Related papers (2024-05-13T16:57:48Z)
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
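For contrast with the aggregation-free design, the aggregate-then-adapt loop that FedAF departs from can be sketched as weighted parameter averaging followed by local adaptation (a generic FedAvg-style baseline, not FedAF itself):

```python
import torch

def fedavg_aggregate(client_states, client_sizes):
    """Server half of aggregate-then-adapt: average client parameters,
    weighted by local dataset size (FedAvg-style)."""
    total = sum(client_sizes)
    return {
        key: sum(state[key] * (n / total)
                 for state, n in zip(client_states, client_sizes))
        for key in client_states[0]
    }

# Each round, clients load the aggregated state, adapt it on local
# data, and return their updated states for the next aggregation.
```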
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- FedImpro: Measuring and Improving Client Update in Federated Learning [77.68805026788836]
Federated Learning (FL) models often experience client drift caused by heterogeneous data.
We present an alternative perspective on client drift and aim to mitigate it by generating improved local models.
arXiv Detail & Related papers (2024-02-10T18:14:57Z)
- Personalized Federated Learning under Mixture of Distributions [98.25444470990107]
We propose FedGMM, a novel approach to Personalized Federated Learning (PFL) that utilizes Gaussian mixture models (GMM) to fit the input data distributions across diverse clients.
FedGMM possesses an additional advantage of adapting to new clients with minimal overhead, and it also enables uncertainty quantification.
Empirical evaluations on synthetic and benchmark datasets demonstrate the superior performance of our method in both PFL classification and novel sample detection.
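The uncertainty-quantification claim follows from the mixture model itself: a fitted GMM assigns each input a log-likelihood, so low-likelihood samples can be flagged as novel. A minimal scikit-learn sketch (the feature space and threshold rule are assumptions; FedGMM fits its mixtures jointly across clients rather than per client):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
client_features = rng.normal(size=(500, 16))  # stand-in for client inputs

gmm = GaussianMixture(n_components=3, random_state=0).fit(client_features)

# Calibrate a novelty threshold on in-distribution data, e.g. the
# 1st percentile of the training log-likelihoods.
threshold = np.percentile(gmm.score_samples(client_features), 1)

def is_novel(x):
    """Flag samples whose log-likelihood under the mixture is low."""
    return gmm.score_samples(x) < threshold
```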
arXiv Detail & Related papers (2023-05-01T20:04:46Z)
- FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning [87.08902493524556]
Federated learning (FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
In detail, we construct synthetic sets of data on each client to locally match the loss landscape from original data.
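One plausible reading of "locally matching the loss landscape" (illustrative only; FedDM's actual objective and its iterative server interaction differ in detail) is to optimize a small synthetic set so the model's loss gradient on it matches the gradient on the real data:

```python
import torch
import torch.nn.functional as F

def match_synthetic(model, real_x, real_y, syn_x, syn_y, steps=100, lr=0.1):
    """Optimize synthetic inputs so their loss gradient w.r.t. the model
    parameters matches the gradient produced by the real local data."""
    syn_x = syn_x.clone().requires_grad_(True)
    opt = torch.optim.Adam([syn_x], lr=lr)
    params = [p for p in model.parameters() if p.requires_grad]
    g_real = torch.autograd.grad(
        F.cross_entropy(model(real_x), real_y), params)
    for _ in range(steps):
        g_syn = torch.autograd.grad(
            F.cross_entropy(model(syn_x), syn_y), params, create_graph=True)
        loss = sum(F.mse_loss(a, b.detach()) for a, b in zip(g_syn, g_real))
        opt.zero_grad()
        loss.backward()
        opt.step()
    return syn_x.detach()
```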
arXiv Detail & Related papers (2022-07-20T04:55:18Z)
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose a data-free knowledge distillation method to fine-tune the global model on the server (FedFTG).
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
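Schematically, server-side data-free fine-tuning can look like the sketch below, where pseudo-samples from a generator stand in for real data and the global model is distilled toward the client ensemble. The generator network, batch size, and losses are assumptions; FedFTG also trains the generator itself, which is omitted here.

```python
import torch
import torch.nn.functional as F

def server_finetune(global_model, client_models, generator,
                    steps=100, batch=32, z_dim=64):
    """Fine-tune the global model on generator pseudo-data so its
    predictions match the averaged client (teacher) predictions."""
    opt = torch.optim.Adam(global_model.parameters(), lr=1e-3)
    for _ in range(steps):
        with torch.no_grad():
            x = generator(torch.randn(batch, z_dim))  # hypothetical generator
            teacher = torch.stack([m(x) for m in client_models]).mean(0)
        loss = F.kl_div(F.log_softmax(global_model(x), dim=1),
                        F.softmax(teacher, dim=1),
                        reduction="batchmean")
        opt.zero_grad()
        loss.backward()
        opt.step()
```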
arXiv Detail & Related papers (2022-03-17T11:18:17Z)
- WAFFLE: Weighted Averaging for Personalized Federated Learning [38.241216472571786]
We introduce WAFFLE, a personalized collaborative machine learning algorithm based on SCAFFOLD.
WAFFLE uses the Euclidean distance between clients' updates to weigh their individual contributions.
Our experiments demonstrate the effectiveness of WAFFLE compared with other methods.
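The distance-based weighting can be made concrete with a small sketch (the softmax-over-negative-distances kernel is an assumed choice; WAFFLE's exact weighting differs in detail):

```python
import torch

def distance_weights(updates, target_idx, temperature=1.0):
    """Weight each client's update by its Euclidean proximity to the
    target client's update: closer updates contribute more."""
    flat = [torch.cat([p.flatten() for p in u]) for u in updates]
    dists = torch.stack([torch.dist(flat[target_idx], f) for f in flat])
    return torch.softmax(-dists / temperature, dim=0)

def personalized_average(updates, weights):
    """Per-parameter weighted average of all clients' updates,
    producing one client's personalized update."""
    return [sum(w * u[i] for w, u in zip(weights, updates))
            for i in range(len(updates[0]))]
```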
arXiv Detail & Related papers (2021-10-13T18:40:54Z)
- Toward Understanding the Influence of Individual Clients in Federated Learning [52.07734799278535]
Federated learning allows clients to jointly train a global model without sending their private data to a central server.
We define a new notion called Influence, quantify this influence over the model parameters, and propose an effective and efficient model to estimate this metric.
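As a purely illustrative proxy in the same spirit (this is not the paper's estimator), one can measure how far the aggregated parameters move when a single client is left out:

```python
import torch

def leave_one_out_shift(client_states, exclude_idx):
    """Illustrative influence proxy: parameter-space distance between
    the average of all client states and the average without one client."""
    keys = client_states[0].keys()
    full = {k: torch.stack([s[k] for s in client_states]).mean(0) for k in keys}
    rest = [s for i, s in enumerate(client_states) if i != exclude_idx]
    loo = {k: torch.stack([s[k] for s in rest]).mean(0) for k in keys}
    return torch.sqrt(sum(torch.sum((full[k] - loo[k]) ** 2) for k in keys))
```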
arXiv Detail & Related papers (2020-12-20T14:34:36Z)