Fake It Till Make It: Federated Learning with Consensus-Oriented
Generation
- URL: http://arxiv.org/abs/2312.05966v1
- Date: Sun, 10 Dec 2023 18:49:59 GMT
- Title: Fake It Till Make It: Federated Learning with Consensus-Oriented
Generation
- Authors: Rui Ye, Yaxin Du, Zhenyang Ni, Siheng Chen, Yanfeng Wang
- Abstract summary: We propose federated learning with consensus-oriented generation (FedCOG)
FedCOG consists of two key components at the client side: complementary data generation and knowledge-distillation-based model training.
Experiments on classical and real-world FL datasets show that FedCOG consistently outperforms state-of-the-art methods.
- Score: 52.82176415223988
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In federated learning (FL), data heterogeneity is a key bottleneck that
causes model divergence and limits performance. To address this, existing
methods often regard data heterogeneity as an inherent property and propose to
mitigate its adverse effects by correcting models. In this paper, we seek to
break this inherent property by generating data to complement the original
dataset, thereby fundamentally reducing the level of heterogeneity. As a novel attempt from
the perspective of data, we propose federated learning with consensus-oriented
generation (FedCOG). FedCOG consists of two key components at the client side:
complementary data generation, which generates data extracted from the shared
global model to complement the original dataset, and
knowledge-distillation-based model training, which distills knowledge from the
global model to the local model based on the generated data to mitigate
over-fitting to the original heterogeneous dataset. FedCOG has two critical
advantages: 1) it can be a plug-and-play module to further improve the
performance of most existing FL methods, and 2) it is naturally compatible with
standard FL protocols such as Secure Aggregation since it makes no modification
to the communication process. Extensive experiments on classical and real-world FL
datasets show that FedCOG consistently outperforms state-of-the-art methods.
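To make the abstract's two client-side components concrete, the following is a minimal PyTorch-style sketch of one FedCOG-flavoured local round. It is only an illustration of the idea described above: the model-inversion generation objective, the KL-based distillation term, and every function name and hyper-parameter here are assumptions, not the authors' reference implementation.

```python
# Hedged sketch of the two client-side steps described in the abstract.
# All function names, losses, and hyper-parameters are illustrative.
import copy
import torch
import torch.nn.functional as F


def generate_complementary_data(global_model, num_samples, num_classes,
                                input_shape, steps=200, lr=0.1, device="cpu"):
    """Synthesize inputs that the frozen global model classifies confidently.

    This stands in for "generating data extracted from the shared global
    model": a simple model-inversion objective over random noise.
    """
    global_model.eval()
    x = torch.randn(num_samples, *input_shape, device=device, requires_grad=True)
    y = torch.randint(0, num_classes, (num_samples,), device=device)
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.cross_entropy(global_model(x), y)
        loss.backward()
        opt.step()
    return x.detach(), y


def local_train_with_kd(global_model, local_loader, gen_x,
                        epochs=1, lr=0.01, kd_weight=1.0, T=2.0, device="cpu"):
    """Train the local model with CE on real data plus KD on generated data.

    The KD term pulls the local model toward the global model's predictions
    on the generated samples, curbing over-fitting to the local dataset.
    """
    local_model = copy.deepcopy(global_model).to(device)
    local_model.train()
    opt = torch.optim.SGD(local_model.parameters(), lr=lr)
    with torch.no_grad():
        teacher_logits = global_model(gen_x)
    for _ in range(epochs):
        for xb, yb in local_loader:
            xb, yb = xb.to(device), yb.to(device)
            ce = F.cross_entropy(local_model(xb), yb)
            kd = F.kl_div(
                F.log_softmax(local_model(gen_x) / T, dim=1),
                F.softmax(teacher_logits / T, dim=1),
                reduction="batchmean",
            ) * (T * T)
            loss = ce + kd_weight * kd
            opt.zero_grad()
            loss.backward()
            opt.step()
    return local_model  # uploaded to the server for standard aggregation
```

Only the local step is modified; model upload and server-side averaging are untouched, which is why the abstract can claim plug-and-play use with existing FL methods and compatibility with Secure Aggregation.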
Related papers
- A Unified Solution to Diverse Heterogeneities in One-shot Federated Learning [14.466679488063217]
One-shot federated learning (FL) limits the communication between the server and clients to a single round.
We propose a unified, data-free, one-shot FL framework (FedHydra) that can effectively address both model and data heterogeneity.
arXiv Detail & Related papers (2024-10-28T15:20:52Z)
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on effectively utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round (a minimal sketch of this standard round is given after this list).
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- FLIGAN: Enhancing Federated Learning with Incomplete Data using GAN [1.5749416770494706]
Federated Learning (FL) provides a privacy-preserving mechanism for distributed training of machine learning models on networked devices.
We propose FLIGAN, a novel approach to address the issue of data incompleteness in FL.
Our methodology adheres to FL's privacy requirements by generating synthetic data in a federated manner without sharing the actual data in the process.
arXiv Detail & Related papers (2024-03-25T16:49:38Z)
- FLASH: Federated Learning Across Simultaneous Heterogeneities [54.80435317208111]
FLASH (Federated Learning Across Simultaneous Heterogeneities) is a lightweight and flexible client selection algorithm.
It outperforms state-of-the-art FL frameworks under extensive sources of heterogeneity.
It achieves substantial and consistent improvements over state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-13T20:04:39Z)
- Federated Virtual Learning on Heterogeneous Data with Local-global Distillation [17.998623216905496]
We propose a new method, called Federated Virtual Learning on Heterogeneous Data with Local-Global Distillation (FedLGD).
Our method outperforms state-of-the-art heterogeneous FL algorithms under various settings with a very limited amount of distilled virtual data.
arXiv Detail & Related papers (2023-03-04T00:35:29Z)
- Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning [112.69497636932955]
Federated learning aims to train models across different clients without the sharing of data for privacy considerations.
We study how data heterogeneity affects the representations of the globally aggregated models.
We propose FedDecorr, a novel method that can effectively mitigate dimensional collapse in federated learning.
arXiv Detail & Related papers (2022-10-01T09:04:17Z)
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose FedFTG, a data-free knowledge distillation method to fine-tune the global model on the server.
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
arXiv Detail & Related papers (2022-03-17T11:18:17Z)
- Federated Causal Discovery [74.37739054932733]
This paper develops a gradient-based learning framework named DAG-Shared Federated Causal Discovery (DS-FCD).
It can learn the causal graph without directly touching local data and naturally handle the data heterogeneity.
Extensive experiments on both synthetic and real-world datasets verify the efficacy of the proposed method.
arXiv Detail & Related papers (2021-12-07T08:04:12Z)
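For context on the "aggregate-then-adapt" round referenced in the FedAF entry above (and on the server side that FedCOG leaves unchanged), below is a generic FedAvg-style sketch. The weighted state-dict averaging and the `local_train_fn` hook are illustrative assumptions, not code from any of the listed papers.

```python
# Hedged sketch of one standard FedAvg-style "aggregate-then-adapt" round:
# the server averages client weights, then clients adapt the new global model.
from typing import Dict, List
import torch


def fedavg_aggregate(client_states: List[Dict[str, torch.Tensor]],
                     client_sizes: List[int]) -> Dict[str, torch.Tensor]:
    """Weighted average of client state_dicts, proportional to local data size."""
    total = float(sum(client_sizes))
    global_state = {}
    for key in client_states[0]:
        stacked = torch.stack(
            [state[key].float() * (n / total)
             for state, n in zip(client_states, client_sizes)]
        )
        global_state[key] = stacked.sum(dim=0)
    return global_state


def run_round(global_model, clients, local_train_fn):
    """One round: broadcast global weights, let each client adapt, aggregate.

    `local_train_fn(global_model, client)` is a hypothetical hook that returns
    the adapted local model and the client's number of training samples.
    """
    states, sizes = [], []
    for client in clients:
        local_model, num_samples = local_train_fn(global_model, client)
        states.append(local_model.state_dict())
        sizes.append(num_samples)
    global_model.load_state_dict(fedavg_aggregate(states, sizes))
    return global_model
```

Under this round structure, a method like FedCOG would only swap in a different `local_train_fn`, leaving `fedavg_aggregate` and the communication pattern intact.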
This list is automatically generated from the titles and abstracts of the papers on this site.