Federated Learning in Non-IID Settings Aided by Differentially Private Synthetic Data
- URL: http://arxiv.org/abs/2206.00686v2
- Date: Thu, 20 Apr 2023 01:50:34 GMT
- Title: Federated Learning in Non-IID Settings Aided by Differentially Private Synthetic Data
- Authors: Huancheng Chen and Haris Vikalo
- Abstract summary: Federated learning (FL) is a privacy-promoting framework that enables clients to collaboratively train machine learning models.
A major challenge in federated learning arises when the local data is heterogeneous.
We propose FedDPMS, an FL algorithm in which clients deploy variational auto-encoders to augment local datasets with data synthesized using differentially private means of latent data representations.
- Score: 20.757477553095637
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) is a privacy-promoting framework that enables
a potentially large number of clients to collaboratively train machine learning
models. In an FL system, a server coordinates the collaboration by collecting
and aggregating clients' model updates while the clients' data remains local
and private. A major challenge in federated learning arises when the local data
is heterogeneous -- a setting in which the performance of the learned global
model may deteriorate significantly compared to the scenario where the data is
identically distributed across the clients. In this paper we propose FedDPMS
(Federated Differentially Private Means Sharing), an FL algorithm in which
clients deploy variational auto-encoders to augment local datasets with data
synthesized using differentially private means of latent data representations
communicated by a trusted server. Such augmentation ameliorates the effects of
data heterogeneity across the clients without compromising privacy. Our experiments
on deep image classification tasks demonstrate that FedDPMS outperforms
competing state-of-the-art FL methods specifically designed for heterogeneous
data settings.
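To make the described pipeline concrete, below is a minimal sketch of its two stages, assuming the standard Gaussian mechanism for the differentially private release and a trained VAE decoder for synthesis. The function names, the clipping-based sensitivity bound, and the jitter around the released mean are illustrative assumptions rather than the paper's exact protocol.

```python
import numpy as np

def dp_latent_mean(latents, clip_norm, eps, delta):
    """Release the mean of VAE latent codes under (eps, delta)-DP using the
    Gaussian mechanism: clip each code's L2 norm so the mean has sensitivity
    clip_norm / n, then add calibrated Gaussian noise."""
    n, d = latents.shape
    norms = np.linalg.norm(latents, axis=1, keepdims=True)
    clipped = latents * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))
    sensitivity = clip_norm / n
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return clipped.mean(axis=0) + np.random.normal(0.0, sigma, size=d)

def synthesize_from_mean(decode, dp_mean, num_samples, jitter=0.1):
    """Decode synthetic samples from jittered copies of a released DP mean.
    `decode` maps a (num_samples, d) latent batch to a data batch."""
    z = dp_mean + jitter * np.random.randn(num_samples, dp_mean.shape[0])
    return decode(z)
```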
Related papers
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- Personalized Federated Learning with Attention-based Client Selection [57.71009302168411]
We propose FedACS, a new PFL algorithm with an Attention-based Client Selection mechanism.
FedACS integrates an attention mechanism to enhance collaboration among clients with similar data distributions.
Experiments on CIFAR10 and FMNIST validate FedACS's superiority.
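A toy sketch of one plausible instantiation of attention-based client selection follows, using scaled dot-product attention over per-client data descriptors; the descriptor construction, `temperature`, and top-k rule are assumptions for illustration, not FedACS's exact mechanism.

```python
import numpy as np

def attention_select(descriptors, query_idx, k, temperature=1.0):
    """Rank peers for one client by scaled dot-product attention between
    per-client data descriptors; return the top-k peers and the weights."""
    D = np.asarray(descriptors, dtype=float)
    q = D[query_idx]
    scores = D @ q / (np.sqrt(D.shape[1]) * temperature)
    scores[query_idx] = -np.inf  # a client never selects itself
    weights = np.exp(scores - scores[np.isfinite(scores)].max())
    weights /= weights.sum()
    top_k = np.argsort(weights)[::-1][:k]
    return top_k, weights
```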
arXiv Detail & Related papers (2023-12-23T03:31:46Z)
- DCFL: Non-IID awareness Data Condensation aided Federated Learning [0.8158530638728501]
Federated learning is a decentralized learning paradigm wherein a central server trains a global model iteratively by utilizing clients who each hold a certain amount of private data.
The challenge lies in the fact that client-side private data may not be independently and identically distributed.
We propose DCFL, which divides clients into groups using the Centered Kernel Alignment (CKA) method and then applies non-IID-aware dataset condensation to complement each client's local data.
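The grouping step relies on CKA, which has a standard closed form; a minimal implementation of linear CKA is below, as it could be computed between two clients' feature matrices on a shared probe batch (the probe-batch setup and any grouping threshold are assumptions here, not DCFL's stated procedure).

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between representation matrices
    X (n, p) and Y (n, q) computed on the same n probe inputs; returns a
    similarity in [0, 1] invariant to orthogonal transforms and scaling."""
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    hsic = np.linalg.norm(X.T @ Y, "fro") ** 2
    return hsic / (np.linalg.norm(X.T @ X, "fro") *
                   np.linalg.norm(Y.T @ Y, "fro"))
```

Clients whose pairwise CKA is high would then land in the same group.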
arXiv Detail & Related papers (2023-12-21T13:04:24Z)
- Federated Learning Empowered by Generative Content [55.576885852501775]
Federated learning (FL) enables leveraging distributed private data for model training in a privacy-preserving way.
We propose a novel FL framework termed FedGC, designed to mitigate data heterogeneity issues by diversifying private data with generative content.
We conduct a systematic empirical study on FedGC, covering diverse baselines, datasets, scenarios, and modalities.
arXiv Detail & Related papers (2023-12-10T07:38:56Z)
- PFL-GAN: When Client Heterogeneity Meets Generative Models in Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation.
Empirical results from rigorous experiments on several well-known datasets demonstrate the effectiveness of PFL-GAN.
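The two named steps, learning client similarity and then weighted collaborative aggregation, suggest the following sketch; how PFL-GAN measures similarity and what exactly it aggregates are specified in the paper, so the payload format, clipping, and weight normalization below are illustrative assumptions.

```python
import numpy as np

def weighted_collaboration(similarity_row, client_payloads):
    """Aggregate peers' contributions for one client, weighting each peer by
    its normalized similarity. `client_payloads` could be generated samples
    or model deltas; treating them as stackable arrays is an assumption."""
    w = np.clip(np.asarray(similarity_row, dtype=float), 0.0, None)
    w = w / max(w.sum(), 1e-12)
    stacked = np.stack([np.asarray(p, dtype=float) for p in client_payloads])
    return np.tensordot(w, stacked, axes=1)  # similarity-weighted average
```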
arXiv Detail & Related papers (2023-08-23T22:38:35Z)
- PS-FedGAN: An Efficient Federated Learning Framework Based on Partially Shared Generative Adversarial Networks For Data Privacy [56.347786940414935]
Federated Learning (FL) has emerged as an effective learning paradigm for distributed computation.
This work proposes a novel FL framework that requires only partial GAN model sharing.
Named PS-FedGAN, this new framework enhances the GAN releasing and training mechanism to address heterogeneous data distributions.
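A hedged sketch of what "partial GAN model sharing" could look like in code: only one sub-network is communicated and federated-averaged, while the other never leaves the client. The choice of the discriminator as the shared part and the plain FedAvg aggregation are assumptions for illustration; client GANs are assumed to be torch modules exposing that sub-network as an attribute.

```python
import copy
import torch

def aggregate_shared_part(global_gan, client_gans, shared="discriminator"):
    """FedAvg over only the shared GAN sub-network: the module named by
    `shared` is averaged across clients; the other sub-network stays local.
    Which part PS-FedGAN actually shares follows the paper, not this sketch."""
    modules = [getattr(gan, shared) for gan in client_gans]
    avg_state = copy.deepcopy(modules[0].state_dict())
    for key in avg_state:
        avg_state[key] = torch.stack(
            [m.state_dict()[key].float() for m in modules]
        ).mean(dim=0)
    getattr(global_gan, shared).load_state_dict(avg_state)
    return global_gan
```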
arXiv Detail & Related papers (2023-05-19T05:39:40Z)
- Personalized Privacy-Preserving Framework for Cross-Silo Federated Learning [0.0]
Federated learning (FL) is a promising decentralized deep learning (DL) framework that enables DL models to be trained collaboratively across clients without sharing private data.
In this paper, we propose a novel framework, namely Personalized Privacy-Preserving Federated Learning (PPPFL).
Our proposed framework outperforms multiple FL baselines on different datasets, including MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100.
arXiv Detail & Related papers (2023-02-22T07:24:08Z)
- The Best of Both Worlds: Accurate Global and Personalized Models through Federated Learning with Data-Free Hyper-Knowledge Distillation [17.570719572024608]
FedHKD (Federated Hyper-Knowledge Distillation) is a novel FL algorithm in which clients rely on knowledge distillation to train local models.
Unlike other KD-based pFL methods, FedHKD neither relies on a public dataset nor deploys a generative model at the server.
We conduct extensive experiments on visual datasets in a variety of scenarios, demonstrating that FedHKD provides significant improvement in both personalized as well as global model performance.
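As a rough illustration of distillation without a public dataset or server-side generator, the sketch below augments local cross-entropy with two distillation terms driven by globally aggregated per-class statistics; the exact loss shape, the weight `lam`, and the temperature `T` are illustrative assumptions rather than FedHKD's precise objective.

```python
import torch
import torch.nn.functional as F

def local_loss_with_hyper_knowledge(features, logits, labels,
                                    class_means, class_soft,
                                    lam=0.05, T=2.0):
    """Cross-entropy plus two data-free distillation terms: features are
    pulled toward globally aggregated per-class representation means, and
    predictions are matched to aggregated per-class soft labels."""
    ce = F.cross_entropy(logits, labels)
    rep_term = F.mse_loss(features, class_means[labels])
    kd_term = F.kl_div(F.log_softmax(logits / T, dim=1),
                       class_soft[labels],
                       reduction="batchmean") * (T * T)
    return ce + lam * (rep_term + kd_term)
```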
arXiv Detail & Related papers (2023-01-21T16:20:57Z)
- FRAug: Tackling Federated Learning with Non-IID Features via Representation Augmentation [31.12851987342467]
Federated Learning (FL) is a decentralized learning paradigm in which multiple clients collaboratively train deep learning models.
We propose Federated Representation Augmentation (FRAug) to tackle this practical and challenging problem.
Our approach generates synthetic client-specific samples in the embedding space to augment the usually small client datasets.
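A minimal sketch of embedding-space augmentation in this spirit: a small label-conditioned generator produces synthetic feature vectors that can be mixed into a client's batch before the classifier head. The architecture and conditioning scheme are assumptions, not FRAug's actual generator.

```python
import torch
import torch.nn as nn

class EmbeddingAugmenter(nn.Module):
    """Toy generator mapping noise plus a label embedding to synthetic
    feature vectors, used to enlarge a small client batch in embedding
    space before the classifier head."""

    def __init__(self, num_classes, noise_dim, feat_dim):
        super().__init__()
        self.label_emb = nn.Embedding(num_classes, noise_dim)
        self.net = nn.Sequential(
            nn.Linear(noise_dim, 256), nn.ReLU(),
            nn.Linear(256, feat_dim),
        )

    def forward(self, labels):
        z = torch.randn(labels.shape[0], self.label_emb.embedding_dim)
        return self.net(z + self.label_emb(labels))

# Usage sketch: concatenate real and synthetic features for the head.
# feats = torch.cat([real_feats, augmenter(fake_labels)], dim=0)
```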
arXiv Detail & Related papers (2022-05-30T07:43:42Z)
- FedDC: Federated Learning with Non-IID Data via Local Drift Decoupling and Correction [48.85303253333453]
Federated learning (FL) allows multiple clients to collectively train a high-performance global model without sharing their private data.
We propose a novel federated learning algorithm with local drift decoupling and correction (FedDC).
FedDC only introduces lightweight modifications in the local training phase: each client utilizes an auxiliary local drift variable to track the gap between the local and global model parameters.
Experimental results and analysis demonstrate that FedDC yields faster convergence and better performance on various image classification tasks.
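The drift mechanism described above admits a compact sketch: each client keeps a drift variable, uses it to correct its local gradient toward the global model, and folds the residual parameter gap into the drift after training. The penalty weight `alpha` and the flattened-parameter view are simplifications of the paper's objective, and the inputs are assumed to be numpy arrays or torch tensors of identical shape.

```python
def feddc_local_grad(local_w, global_w, drift, task_grad, alpha=0.01):
    """Gradient for one local step: the task gradient plus a correction
    pulling (local + drift) toward the global parameters."""
    return task_grad + alpha * (local_w + drift - global_w)

def feddc_update_drift(drift, local_w, global_w):
    """After local training, fold the remaining local-global parameter gap
    into the client's drift variable for the next round."""
    return drift + (local_w - global_w)
```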
arXiv Detail & Related papers (2022-03-22T14:06:26Z)
This list is automatically generated from the titles and abstracts of the papers on this site.