Federated Learning via Input-Output Collaborative Distillation
- URL: http://arxiv.org/abs/2312.14478v1
- Date: Fri, 22 Dec 2023 07:05:13 GMT
- Title: Federated Learning via Input-Output Collaborative Distillation
- Authors: Xuan Gong, Shanglin Li, Yuxiang Bao, Barry Yao, Yawen Huang, Ziyan Wu,
Baochang Zhang, Yefeng Zheng, David Doermann
- Abstract summary: Federated learning (FL) is a machine learning paradigm in which distributed local nodes collaboratively train a central model without sharing individually held private data.
We propose a data-free FL framework based on local-to-central collaborative distillation with direct input and output space exploitation.
- Score: 40.38454921071808
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) is a machine learning paradigm in which distributed
local nodes collaboratively train a central model without sharing individually
held private data. Existing FL methods either iteratively share local model
parameters or deploy co-distillation. However, the former is highly susceptible
to private data leakage, and the latter presupposes access to task-relevant
real data. Instead, we propose a data-free FL framework based on
local-to-central collaborative distillation with direct input and output space
exploitation. Our design eliminates any requirement of recursive local
parameter exchange or auxiliary task-relevant data to transfer knowledge,
thereby giving direct privacy control to local users. In particular, to cope
with the inherent data heterogeneity across local nodes, our technique learns
to distill inputs on which each local model produces consensual yet unique
outputs that represent its own expertise. Extensive experiments on image
classification and segmentation tasks, under various real-world heterogeneous
federated learning settings on both natural and medical images, show that the
proposed FL framework achieves notable privacy-utility trade-offs.
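The abstract describes the mechanism only at a high level. As a rough, assumption-laden illustration (not the authors' published algorithm), a server holding frozen local models could first optimize synthetic inputs toward consensual, confident local predictions, then distill the locals' averaged outputs into the central model. All architectures, loss weights, and optimizers below are hypothetical.
```python
import torch
import torch.nn.functional as F

def distill_round(local_models, central, n_synth=64, in_dim=32, steps=100):
    """Hypothetical input-output distillation round; `local_models` are
    frozen client models, `central` is the server model."""
    # 1) Input space: learn synthetic inputs on which the frozen local
    #    models agree (low cross-model variance) and are confident.
    x = torch.randn(n_synth, in_dim, requires_grad=True)
    opt_x = torch.optim.Adam([x], lr=0.1)
    for _ in range(steps):
        probs = torch.stack([m(x) for m in local_models]).softmax(-1)  # (K,B,C)
        disagreement = probs.var(dim=0).mean()             # consensus term
        mean_p = probs.mean(0)
        entropy = -(mean_p * mean_p.clamp_min(1e-8).log()).sum(-1).mean()
        loss = disagreement + 0.1 * entropy                # weight is a guess
        opt_x.zero_grad(); loss.backward(); opt_x.step()

    # 2) Output space: distill the locals' averaged predictions on the
    #    learned inputs into the central model; no private data is shared.
    x = x.detach()
    with torch.no_grad():
        teacher = torch.stack([m(x) for m in local_models]).softmax(-1).mean(0)
    opt_c = torch.optim.Adam(central.parameters(), lr=1e-3)
    for _ in range(steps):
        kd = F.kl_div(central(x).log_softmax(-1), teacher, reduction="batchmean")
        opt_c.zero_grad(); kd.backward(); opt_c.step()
```
The "unique" side of the paper's objective (keeping each local model's expertise distinguishable) is omitted here for brevity.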
Related papers
- Personalized Federated Learning for Cross-view Geo-localization [49.40531019551957]
We propose a methodology combining Federated Learning (FL) with Cross-view Image Geo-localization (CVGL) techniques.
Our method implements a coarse-to-fine approach, where clients share only the coarse feature extractors while keeping fine-grained features specific to local environments.
Results demonstrate that our federated CVGL method achieves performance close to centralized training while maintaining data privacy.
arXiv Detail & Related papers (2024-11-07T13:25:52Z)
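For the coarse-to-fine split described in the entry above, one hedged reading is federated averaging restricted to the shared ("coarse") parameters; the name-prefix partition below is an invented convention, not necessarily the paper's.
```python
import torch

def aggregate_coarse(client_states, coarse_prefix="coarse."):
    """FedAvg restricted to parameters whose names start with `coarse_prefix`
    (a hypothetical naming convention); fine-grained layers never leave
    their clients."""
    shared_keys = [k for k in client_states[0] if k.startswith(coarse_prefix)]
    return {k: torch.stack([s[k].float() for s in client_states]).mean(dim=0)
            for k in shared_keys}

# Each client then merges the averaged coarse weights into its own model:
# model.load_state_dict(aggregate_coarse(states), strict=False)
```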
- pFedES: Model Heterogeneous Personalized Federated Learning with Feature Extractor Sharing [19.403843478569303]
We propose a model-heterogeneous personalized federated learning approach based on feature extractor sharing.
It incorporates a small homogeneous feature extractor into each client's heterogeneous local model.
It achieves 1.61% higher test accuracy, while reducing communication and computation costs by 99.6% and 82.9%, respectively.
arXiv Detail & Related papers (2023-11-12T15:43:39Z)
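pFedES's "small homogeneous feature extractor inside a heterogeneous model" (entry above) could be wired as below; the concatenation is one plausible composition and is an assumption, not the paper's design.
```python
import torch
import torch.nn as nn

class ClientModel(nn.Module):
    """Hypothetical composition: a small extractor whose weights are shared
    and aggregated across clients, plus an arbitrary heterogeneous private
    backbone that never leaves the client."""
    def __init__(self, shared_extractor: nn.Module, private_model: nn.Module):
        super().__init__()
        self.shared = shared_extractor   # homogeneous; its weights travel
        self.private = private_model     # heterogeneous; stays local

    def forward(self, x):
        # One plausible wiring: feed the input augmented with the shared
        # features into the private model (the paper's wiring may differ).
        return self.private(torch.cat([x, self.shared(x)], dim=-1))
```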
- VFedMH: Vertical Federated Learning for Training Multiple Heterogeneous Models [53.30484242706966]
This paper proposes a novel approach called Vertical Federated Learning for Training Multiple Heterogeneous Models (VFedMH).
To protect the participants' local embedding values, we propose an embedding protection method based on lightweight blinding factors.
Experiments are conducted to demonstrate that VFedMH can simultaneously train multiple heterogeneous models with heterogeneous optimization and outperform some recent methods in model performance.
arXiv Detail & Related papers (2023-10-20T09:22:51Z)
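"Lightweight blinding factors" in the entry above is not specified further; a standard construction it evokes is pairwise additive masks that cancel under summation, sketched below (VFedMH's actual scheme may differ).
```python
import numpy as np

def blind_embeddings(embeddings, seed=0):
    """Pairwise additive blinding: party i adds a random factor that party j
    subtracts, so each blinded embedding hides its raw value while the sum
    over parties is unchanged."""
    rng = np.random.default_rng(seed)   # stands in for pairwise shared seeds
    masks = [np.zeros_like(e) for e in embeddings]
    for i in range(len(embeddings)):
        for j in range(i + 1, len(embeddings)):
            r = rng.normal(size=embeddings[i].shape)
            masks[i] += r
            masks[j] -= r
    return [e + m for e, m in zip(embeddings, masks)]

embs = [np.ones(4) * i for i in range(3)]
blinded = blind_embeddings(embs)
assert np.allclose(sum(blinded), sum(embs))  # aggregation is unaffected
```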
- Federated Virtual Learning on Heterogeneous Data with Local-Global Distillation [17.998623216905496]
We propose a new method, called Federated Virtual Learning on Heterogeneous Data with Local-Global Distillation (FedLGD).
Our method outperforms state-of-the-art heterogeneous FL algorithms under various settings with a very limited amount of distilled virtual data.
arXiv Detail & Related papers (2023-03-04T00:35:29Z)
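The entry above does not say how the "distilled virtual data" is built; one common recipe it evokes is distribution matching, i.e., learning a handful of synthetic samples whose features match the local data's mean features under a shared extractor. The sketch below uses that recipe purely as an assumption.
```python
import torch

def distill_virtual_data(real_loader, feat, n_virtual=10, steps=200, lr=0.1):
    """Hedged sketch: learn `n_virtual` synthetic samples whose mean features
    under `feat` match those of the local data (FedLGD's objective may differ)."""
    with torch.no_grad():
        target = torch.cat([feat(x) for x, _ in real_loader]).mean(0)
    sample_shape = next(iter(real_loader))[0].shape[1:]
    x_syn = torch.randn(n_virtual, *sample_shape, requires_grad=True)
    opt = torch.optim.Adam([x_syn], lr=lr)
    for _ in range(steps):
        loss = ((feat(x_syn).mean(0) - target) ** 2).sum()
        opt.zero_grad(); loss.backward(); opt.step()
    return x_syn.detach()   # shared with the server instead of raw data
```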
- Federated Learning with Privacy-Preserving Ensemble Attention Distillation [63.39442596910485]
Federated Learning (FL) is a machine learning paradigm where many local nodes collaboratively train a central model while keeping the training data decentralized.
We propose a privacy-preserving FL framework leveraging unlabeled public data for one-way offline knowledge distillation.
Our technique uses decentralized and heterogeneous local data like existing FL approaches, but more importantly, it significantly reduces the risk of privacy leakage.
arXiv Detail & Related papers (2022-10-16T06:44:46Z)
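The one-way offline distillation in the entry above maps naturally onto ensemble soft-labeling of public data; the sketch below omits the attention component named in the title and assumes plain logit-level distillation.
```python
import torch
import torch.nn.functional as F

def one_way_distill(central, local_models, public_loader, epochs=1, lr=1e-3):
    """Hedged sketch of one-way offline KD: frozen local models soft-label
    unlabeled public batches; only these predictions reach the server."""
    opt = torch.optim.Adam(central.parameters(), lr=lr)
    for _ in range(epochs):
        for x in public_loader:                 # unlabeled public batches
            with torch.no_grad():               # ensemble of local teachers
                teacher = torch.stack([m(x) for m in local_models]
                                      ).softmax(-1).mean(0)
            loss = F.kl_div(central(x).log_softmax(-1), teacher,
                            reduction="batchmean")
            opt.zero_grad(); loss.backward(); opt.step()
```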
- Preserving Privacy in Federated Learning with Ensemble Cross-Domain Knowledge Distillation [22.151404603413752]
Federated Learning (FL) is a machine learning paradigm where local nodes collaboratively train a central model.
Existing FL methods typically share model parameters or employ co-distillation to address the issue of unbalanced data distribution.
We develop a privacy preserving and communication efficient method in a FL framework with one-shot offline knowledge distillation.
arXiv Detail & Related papers (2022-09-10T05:20:31Z)
- FedDC: Federated Learning with Non-IID Data via Local Drift Decoupling and Correction [48.85303253333453]
Federated learning (FL) allows multiple clients to collectively train a high-performance global model without sharing their private data.
We propose a novel federated learning algorithm with local drift decoupling and correction (FedDC).
Our FedDC only introduces lightweight modifications in the local training phase, in which each client utilizes an auxiliary local drift variable to track the gap between the local and global model parameters.
Experimental results and analysis demonstrate that FedDC yields faster convergence and better performance on various image classification tasks.
arXiv Detail & Related papers (2022-03-22T14:06:26Z)
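The drift variable in the entry above can be made concrete as follows; the penalty weight and the correction rule are simplifications of FedDC's actual objective, not a faithful reimplementation.
```python
import torch

def feddc_local_step(model, global_params, drift, loader, loss_fn,
                     lam=0.1, lr=0.01):
    """Hedged FedDC-style local training: a penalty keeps (theta + drift)
    close to the global parameters, and the drift variable accumulates the
    remaining local/global gap after training."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for x, y in loader:
        penalty = sum(((p + h - g) ** 2).sum()
                      for p, h, g in zip(model.parameters(),
                                         drift, global_params))
        loss = loss_fn(model(x), y) + 0.5 * lam * penalty
        opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():   # track the accumulated local/global gap
        for h, p, g in zip(drift, model.parameters(), global_params):
            h += p - g
    return drift
```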
- Personalization Improves Privacy-Accuracy Tradeoffs in Federated Optimization [57.98426940386627]
We show that coordinating local learning with private centralized learning yields a generically useful and improved tradeoff between accuracy and privacy.
We illustrate our theoretical results with experiments on synthetic and real-world datasets.
arXiv Detail & Related papers (2022-02-10T20:44:44Z)