Completely Heterogeneous Federated Learning
- URL: http://arxiv.org/abs/2210.15865v1
- Date: Fri, 28 Oct 2022 03:20:10 GMT
- Title: Completely Heterogeneous Federated Learning
- Authors: Chang Liu, Yuwen Yang, Xun Cai, Yue Ding, Hongtao Lu
- Abstract summary: Federated learning (FL) faces three major difficulties: cross-domain, heterogeneous models, and non-i.i.d. label scenarios.
We propose the challenging "completely heterogeneous" scenario in FL, in which no client exposes any private information.
We then devise an FL framework based on parameter decoupling and data-free knowledge distillation to solve the problem.
- Score: 16.426770356031636
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) faces three major difficulties: cross-domain,
heterogeneous models, and non-i.i.d. label scenarios. Existing FL methods cannot
handle all three constraints at the same time and must lower the level of
privacy protection (e.g., by sharing the model architecture and data category
distribution). In this work, we propose the challenging "completely
heterogeneous" scenario in FL, in which no client exposes any private
information, including its feature space, model architecture, and label
distribution. We then devise an FL framework based on
parameter decoupling and data-free knowledge distillation to solve the problem.
Experiments show that our proposed method achieves high performance in
completely heterogeneous scenarios where other approaches fail.
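The distillation component can be made concrete with a short sketch. Below is a minimal, hypothetical server-side step assuming a shared synthetic input space and teachers that expose only logits; the names (datafree_distill_step, generator, student) are illustrative, not the authors' code, and the paper's parameter-decoupling step is omitted:
```python
import torch
import torch.nn.functional as F

def datafree_distill_step(generator, client_models, student, student_opt,
                          z_dim=64, batch_size=32, temperature=2.0):
    """One server-side step of data-free knowledge distillation: a generator
    synthesizes inputs, and heterogeneous client models act as teachers that
    expose only output logits, never weights, data, or label statistics."""
    z = torch.randn(batch_size, z_dim)
    x = generator(z).detach()  # synthetic samples; the generator is not updated here

    with torch.no_grad():
        # Average the teachers' logits across clients.
        teacher_logits = torch.stack([m(x) for m in client_models]).mean(dim=0)

    student_logits = student(x)
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

    student_opt.zero_grad()
    kd_loss.backward()
    student_opt.step()
    return kd_loss.item()
```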
Related papers
- A Unified Solution to Diverse Heterogeneities in One-shot Federated Learning [14.466679488063217]
One-shot federated learning (FL) limits the communication between the server and clients to a single round.
We propose a unified, data-free, one-shot FL framework (FedHydra) that can effectively address both model and data heterogeneity.
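For context, the one-shot communication pattern itself is simple to sketch. The following is a hypothetical illustration using plain weighted weight averaging, which assumes identical client architectures; FedHydra itself targets heterogeneous models and data-free aggregation, so this shows only the single-round protocol:
```python
import copy
import torch

def one_shot_aggregate(client_models, client_sizes):
    """Single-round aggregation: every client uploads its model exactly once
    and the server never communicates again. Shown as a weighted average of
    weights, which assumes identical architectures (FedHydra itself supports
    heterogeneous models and aggregates differently)."""
    total = float(sum(client_sizes))
    global_model = copy.deepcopy(client_models[0])
    avg_state = {
        name: sum(m.state_dict()[name].float() * (n / total)
                  for m, n in zip(client_models, client_sizes))
        for name in global_model.state_dict()
    }
    global_model.load_state_dict(avg_state)
    return global_model
```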
arXiv Detail & Related papers (2024-10-28T15:20:52Z) - FedHPL: Efficient Heterogeneous Federated Learning with Prompt Tuning and Logit Distillation [32.305134875959226]
Federated learning (FL) is a privacy-preserving paradigm that enables distributed clients to collaboratively train models with a central server.
We propose FedHPL, a parameter-efficient unified Federated learning framework for Heterogeneous settings.
We show that our framework outperforms state-of-the-art FL approaches, with less overhead and training rounds.
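The logit-distillation side can be sketched as a loss that pulls local predictions toward server-aggregated per-class logits. This is a guess at a common logit-sharing scheme, not FedHPL's exact protocol; the tensor layout is an assumption:
```python
import torch
import torch.nn.functional as F

def logit_distillation_loss(student_logits, global_logits, labels, T=2.0):
    """Pull local predictions toward server-aggregated per-class logits.

    global_logits: [num_classes, num_classes] tensor whose row c is the
    aggregated logit vector for ground-truth class c (a hypothetical
    logit-sharing layout, not necessarily FedHPL's exact one).
    """
    targets = global_logits[labels]  # one aggregated logit row per sample
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(targets / T, dim=1),
        reduction="batchmean",
    ) * T * T
```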
arXiv Detail & Related papers (2024-05-27T15:25:32Z) - Fake It Till Make It: Federated Learning with Consensus-Oriented
Generation [52.82176415223988]
We propose federated learning with consensus-oriented generation (FedCOG).
FedCOG consists of two key components at the client side: complementary data generation and knowledge-distillation-based model training.
Experiments on classical and real-world FL datasets show that FedCOG consistently outperforms state-of-the-art methods.
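A minimal sketch of the client objective implied by the two components, assuming a frozen global model as the distillation teacher on generated data (the loss weights and names are assumptions, not the paper's code):
```python
import torch
import torch.nn.functional as F

def fedcog_client_loss(local_model, global_model, x_real, y_real, x_gen,
                       T=2.0, beta=1.0):
    """Two-term client objective mirroring the two components above:
    a task loss on real private data, plus distillation from the frozen
    global model on complementary generated data."""
    task_loss = F.cross_entropy(local_model(x_real), y_real)
    with torch.no_grad():
        teacher = F.softmax(global_model(x_gen) / T, dim=1)
    student = F.log_softmax(local_model(x_gen) / T, dim=1)
    kd_loss = F.kl_div(student, teacher, reduction="batchmean") * T * T
    return task_loss + beta * kd_loss
```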
arXiv Detail & Related papers (2023-12-10T18:49:59Z) - Federated Learning Empowered by Generative Content [55.576885852501775]
Federated learning (FL) enables leveraging distributed private data for model training in a privacy-preserving way.
We propose a novel FL framework termed FedGC, designed to mitigate data heterogeneity issues by diversifying private data with generative content.
We conduct a systematic empirical study on FedGC, covering diverse baselines, datasets, scenarios, and modalities.
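The core mechanic, mixing generative content into each client's private data before local training, can be sketched in a few lines (dataset names and proportions are assumptions, not FedGC's prescribed setup):
```python
from torch.utils.data import ConcatDataset, DataLoader

def build_client_loader(private_ds, generated_ds, batch_size=32):
    """Mix a client's private dataset with generated samples so that local
    training sees a more diverse distribution."""
    mixed = ConcatDataset([private_ds, generated_ds])
    return DataLoader(mixed, batch_size=batch_size, shuffle=True)
```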
arXiv Detail & Related papers (2023-12-10T07:38:56Z) - Federated Learning on Virtual Heterogeneous Data with Local-global Distillation [27.476131224950475]
We propose Federated Learning on Virtual Heterogeneous Data with Local-Global dataset Distillation (FedLGD).
Our method outperforms state-of-the-art heterogeneous FL algorithms under various settings.
arXiv Detail & Related papers (2023-03-04T00:35:29Z) - A Survey on Heterogeneous Federated Learning [12.395474890081232]
Federated learning (FL) has been proposed to protect data privacy and assemble isolated data silos by cooperatively training models among organizations without breaching privacy and security.
However, FL faces several forms of heterogeneity, including data-space, statistical, and system heterogeneity.
We propose a precise taxonomy of heterogeneous FL settings for each type of heterogeneity according to the problem setting and learning objective.
arXiv Detail & Related papers (2022-10-10T09:16:43Z) - Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning [112.69497636932955]
Federated learning aims to train models across different clients without the sharing of data for privacy considerations.
We study how data heterogeneity affects the representations of the globally aggregated models.
We propose FedDecorr, a novel method that can effectively mitigate dimensional collapse in federated learning.
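Dimensional collapse means representations concentrate in a low-dimensional subspace; a decorrelation regularizer counteracts it by penalizing correlations between feature dimensions. A minimal sketch of such a regularizer follows (the exact normalization and coefficient in FedDecorr may differ):
```python
import torch

def decorrelation_loss(features, eps=1e-8):
    """Penalize correlations between feature dimensions to discourage
    representations from collapsing into a low-dimensional subspace.
    features: [batch, dim] representations from the local model."""
    z = features - features.mean(dim=0, keepdim=True)
    z = z / (z.std(dim=0, keepdim=True) + eps)
    corr = (z.t() @ z) / z.shape[0]                 # [dim, dim] correlation matrix
    off_diag = corr - torch.diag(torch.diag(corr))  # keep only off-diagonal terms
    return (off_diag ** 2).sum() / z.shape[1]
```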
arXiv Detail & Related papers (2022-10-01T09:04:17Z) - Fine-tuning Global Model via Data-Free Knowledge Distillation for
Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraint.
We propose a data-free knowledge distillation method to fine-tune the global model on the server (FedFTG).
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
arXiv Detail & Related papers (2022-03-17T11:18:17Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated
Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z) - DistFL: Distribution-aware Federated Learning for Mobile Scenarios [14.638070213182655]
Federated learning (FL) has emerged as an effective solution to decentralized and privacy-preserving machine learning for mobile clients.
We propose DistFL, a novel framework to achieve automated and accurate Distribution-aware Federated Learning.
arXiv Detail & Related papers (2021-10-22T06:58:48Z) - Multi-Center Federated Learning [62.32725938999433]
Federated learning (FL) can protect data privacy in distributed learning.
It merely collects local gradients from users without access to their data.
We propose a novel multi-center aggregation mechanism.
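A multi-center aggregation step can be sketched as a K-means-style assignment of client updates to the nearest of several global centers, followed by within-cluster averaging (a simplification of the paper's formulation; all names here are illustrative):
```python
import torch

def multi_center_step(client_vecs, center_vecs):
    """Assign each client's flattened update to the nearest of several
    global centers, then re-estimate each center as the mean of its cluster.
    client_vecs: [n_clients, d]; center_vecs: [k, d]."""
    dists = torch.cdist(client_vecs, center_vecs)  # pairwise distances [n, k]
    assign = dists.argmin(dim=1)                   # nearest center per client
    new_centers = torch.stack([
        client_vecs[assign == j].mean(dim=0) if (assign == j).any()
        else center_vecs[j]                        # keep empty centers unchanged
        for j in range(center_vecs.shape[0])
    ])
    return assign, new_centers
```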
arXiv Detail & Related papers (2021-08-19T12:20:31Z) - FedSemi: An Adaptive Federated Semi-Supervised Learning Framework [23.90642104477983]
Federated learning (FL) has emerged as an effective technique for co-training machine learning models without sharing data or leaking privacy.
Most existing FL methods focus on the supervised setting and ignore the utilization of unlabeled data.
We propose FedSemi, a novel, adaptive, and general framework, which is the first to introduce consistency regularization into FL using a teacher-student model.
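Consistency regularization with a teacher-student pair is typically implemented as an EMA teacher plus a KL consistency loss on unlabeled data. A minimal Mean-Teacher-style sketch, not necessarily FedSemi's exact update:
```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def ema_update(teacher, student, decay=0.99):
    """Teacher weights track the student via an exponential moving average."""
    for t, s in zip(teacher.parameters(), student.parameters()):
        t.mul_(decay).add_(s, alpha=1.0 - decay)

def consistency_loss(student, teacher, x_unlabeled, noise_std=0.1):
    """Student predictions on a perturbed unlabeled batch should match the
    teacher's predictions on the clean batch."""
    with torch.no_grad():
        target = F.softmax(teacher(x_unlabeled), dim=1)
    noisy = x_unlabeled + noise_std * torch.randn_like(x_unlabeled)
    pred = F.log_softmax(student(noisy), dim=1)
    return F.kl_div(pred, target, reduction="batchmean")
```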
arXiv Detail & Related papers (2020-12-06T15:46:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.