Privacy-Preserving Federated Learning with Consistency via Knowledge Distillation Using Conditional Generator
- URL: http://arxiv.org/abs/2409.06955v2
- Date: Mon, 16 Sep 2024 08:23:09 GMT
- Title: Privacy-Preserving Federated Learning with Consistency via Knowledge Distillation Using Conditional Generator
- Authors: Kangyang Luo, Shuai Wang, Xiang Li, Yunshi Lan, Ming Gao, Jinlong Shu
- Abstract summary: Federated Learning (FL) is gaining popularity as a distributed learning framework that only shares model parameters or updates and keeps private data locally.
We propose FedMD-CG, a novel FL method with highly competitive performance and high-level privacy preservation.
We conduct extensive experiments on various image classification tasks to validate the superiority of FedMD-CG.
- Score: 19.00239208095762
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) is gaining popularity as a distributed learning framework that only shares model parameters or gradient updates and keeps private data locally. However, FL is at risk of privacy leakage caused by privacy inference attacks, and most existing privacy-preserving mechanisms in FL conflict with achieving high performance and efficiency. We therefore propose FedMD-CG, a novel FL method with highly competitive performance and high-level privacy preservation, which decouples each client's local model into a feature extractor and a classifier, and uses a conditional generator instead of the feature extractor to perform server-side model aggregation. To ensure consistency between local generators and classifiers, FedMD-CG leverages knowledge distillation to train local models and generators at both the latent-feature level and the logit level. We also construct additional classification losses and design new diversity losses to enhance client-side training. FedMD-CG is robust to data heterogeneity and, unlike cGAN-based methods, does not require training extra discriminators. We conduct extensive experiments on various image classification tasks to validate the superiority of FedMD-CG.
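To make the abstract's training objective concrete, here is a minimal PyTorch sketch of the client-side losses as described above. The module interfaces (extractor, classifier, a generator taking (noise, label), its noise_dim attribute) and the unit loss weights are illustrative assumptions, not the authors' released code; the diversity losses mentioned in the abstract are omitted.

```python
# A minimal sketch of FedMD-CG's client-side objective, inferred from the
# abstract. All interfaces and the unit loss weights are assumptions.
import torch
import torch.nn.functional as F

def client_losses(extractor, classifier, generator, x, y, tau=2.0):
    """Knowledge distillation between the real path (extractor -> classifier)
    and the generated path (conditional generator -> classifier), at both
    the latent-feature level and the logit level."""
    # Real path: private sample -> latent feature -> logits.
    z_real = extractor(x)
    logits_real = classifier(z_real)
    ce_real = F.cross_entropy(logits_real, y)

    # Generated path: the conditional generator maps (noise, label) to a
    # latent feature, standing in for the feature extractor.
    noise = torch.randn(x.size(0), generator.noise_dim, device=x.device)
    z_fake = generator(noise, y)
    logits_fake = classifier(z_fake)

    # Latent-feature-level consistency.
    feat_kd = F.mse_loss(z_fake, z_real.detach())

    # Logit-level consistency (standard temperature-scaled KD).
    logit_kd = F.kl_div(
        F.log_softmax(logits_fake / tau, dim=1),
        F.softmax(logits_real.detach() / tau, dim=1),
        reduction="batchmean",
    ) * tau ** 2

    # The abstract's "additional classification losses": classify the
    # generated features against the ground-truth labels as well.
    ce_fake = F.cross_entropy(logits_fake, y)

    # Diversity losses from the abstract are omitted in this sketch.
    return ce_real + feat_kd + logit_kd + ce_fake
```

Under this reading of the abstract, the server then aggregates clients' generators and classifiers rather than feature extractors, which is what keeps raw features and extractor weights on-device.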
Related papers
- FLea: Addressing Data Scarcity and Label Skew in Federated Learning via Privacy-preserving Feature Augmentation [15.298650496155508]
Federated Learning (FL) enables model development by leveraging data distributed across numerous edge devices without transferring local data to a central server.
Existing FL methods face challenges when dealing with scarce and label-skewed data across devices, resulting in local model overfitting and drift.
We propose a framework called FLea, which tackles these challenges via privacy-preserving feature augmentation.
arXiv Detail & Related papers (2024-06-13T19:28:08Z)
- Federated Learning Empowered by Generative Content [55.576885852501775]
Federated learning (FL) enables leveraging distributed private data for model training in a privacy-preserving way.
We propose a novel FL framework termed FedGC, designed to mitigate data heterogeneity issues by diversifying private data with generative content.
We conduct a systematic empirical study on FedGC, covering diverse baselines, datasets, scenarios, and modalities.
arXiv Detail & Related papers (2023-12-10T07:38:56Z)
- DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning [20.135235291912185]
Federated Learning (FL) is a privacy-constrained decentralized machine learning paradigm.
We propose a new FL method (namely DFRD) to learn a robust global model in the data-heterogeneous and model-heterogeneous FL scenarios.
arXiv Detail & Related papers (2023-09-24T04:29:22Z)
- Rethinking Client Drift in Federated Learning: A Logit Perspective [125.35844582366441]
Federated Learning (FL) enables multiple clients to collaboratively learn in a distributed way, allowing for privacy protection.
We find that the difference in logits between the local and global models increases as the model is continuously updated.
We propose FedCSD, a class-prototype similarity distillation method that aligns the local and global models in a federated framework (a generic sketch of such a logit-alignment term appears after this list).
arXiv Detail & Related papers (2023-08-20T04:41:01Z)
- Selective Knowledge Sharing for Privacy-Preserving Federated Distillation without A Good Teacher [52.2926020848095]
Federated learning is vulnerable to white-box attacks and struggles to adapt to heterogeneous clients.
This paper proposes a selective knowledge sharing mechanism for federated distillation (FD), termed Selective-FD.
arXiv Detail & Related papers (2023-04-04T12:04:19Z)
- Federated Learning with Privacy-Preserving Ensemble Attention Distillation [63.39442596910485]
Federated Learning (FL) is a machine learning paradigm where many local nodes collaboratively train a central model while keeping the training data decentralized.
We propose a privacy-preserving FL framework leveraging unlabeled public data for one-way offline knowledge distillation.
Our technique uses decentralized and heterogeneous local data like existing FL approaches, but more importantly, it significantly reduces the risk of privacy leakage.
arXiv Detail & Related papers (2022-10-16T06:44:46Z)
- Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation [22.151404603413752]
Federated Learning (FL) is a machine learning paradigm where local nodes collaboratively train a central model.
Existing FL methods typically share model parameters or employ co-distillation to address the issue of unbalanced data distribution.
We develop a privacy-preserving and communication-efficient method in an FL framework with one-shot offline knowledge distillation.
arXiv Detail & Related papers (2022-09-10T05:20:31Z)
- Federated Learning on Heterogeneous and Long-Tailed Data via Classifier Re-Training with Federated Features [24.679535905451758]
Federated learning (FL) provides a privacy-preserving solution for distributed machine learning tasks.
One challenging problem that severely degrades the performance of FL models is the co-occurrence of data heterogeneity and long-tailed distributions.
We propose a novel privacy-preserving FL method for heterogeneous and long-tailed data via Classifier Re-training with Federated Features (CReFF).
arXiv Detail & Related papers (2022-04-28T10:35:11Z)
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose FedFTG, a data-free knowledge distillation method that fine-tunes the global model on the server.
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
arXiv Detail & Related papers (2022-03-17T11:18:17Z)
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
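Several entries above distill logits between local and global models; FedCSD in particular weights this alignment by class-prototype similarity. The following is a generic, simplified sketch of such a prototype-weighted logit-alignment term; the names, shapes, and cosine-similarity weighting are assumptions and may differ from the paper's exact formulation.

```python
# A generic sketch of prototype-weighted logit distillation in the spirit
# of FedCSD; the weighting scheme is a simplified assumption.
import torch
import torch.nn.functional as F

def csd_loss(local_logits, global_logits, features, prototypes, labels, tau=4.0):
    """Distill the frozen global model's logits into the local model,
    weighting each sample by the cosine similarity between its feature
    and the global prototype of its class.

    local_logits, global_logits: (B, C); features: (B, D);
    prototypes: (C, D) class prototypes broadcast by the server;
    labels: (B,) ground-truth class indices.
    """
    # Per-sample weight: how well the feature matches its class prototype.
    proto = prototypes[labels]                            # (B, D)
    w = F.cosine_similarity(features, proto, dim=1)       # (B,)
    w = w.clamp(min=0.0).detach()

    # Per-sample KL between softened local and global predictions.
    kl = F.kl_div(
        F.log_softmax(local_logits / tau, dim=1),
        F.softmax(global_logits.detach() / tau, dim=1),
        reduction="none",
    ).sum(dim=1) * tau ** 2                               # (B,)

    return (w * kl).mean()
```

In a local training step this term would be added to the usual cross-entropy loss, with the global model kept frozen and the prototypes broadcast by the server.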