Federated Learning with a Single Shared Image
- URL: http://arxiv.org/abs/2406.12658v1
- Date: Tue, 18 Jun 2024 14:26:09 GMT
- Title: Federated Learning with a Single Shared Image
- Authors: Sunny Soni, Aaqib Saeed, Yuki M. Asano
- Abstract summary: Federated Learning (FL) enables multiple machines to collaboratively train a machine learning model without sharing private training data.
One popular method, FedDF, uses distillation over a common, shared dataset to transfer knowledge between client models and the server.
In this paper, we introduce a new method that improves on this knowledge distillation method so that it relies on only a single shared image.
- Score: 25.313809311019696
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) enables multiple machines to collaboratively train a machine learning model without sharing private training data. Yet, especially for heterogeneous models, a key bottleneck remains the transfer of the knowledge gained by each client model to the server. One popular method, FedDF, uses distillation to tackle this task with the help of a common, shared dataset on which predictions are exchanged. However, in many contexts such a dataset might be difficult to acquire due to privacy constraints, and the clients might not allow for storage of a large shared dataset. To this end, in this paper, we introduce a new method that improves on this knowledge distillation method so that it relies on only a single image shared between the clients and the server. In particular, we propose a novel adaptive dataset pruning algorithm that selects the most informative crops generated from only a single image. With this, we show that federated learning with distillation under a limited shared-dataset budget works better with a single image than with multiple individual images. Finally, we extend our approach to allow for training heterogeneous client architectures by incorporating a non-uniform distillation schedule and client-model mirroring on the server side.
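As a rough illustration of the method described in the abstract, below is a minimal sketch, assuming PyTorch. The helper names (`make_crops`, `prune_crops`, `distill_round`) and the entropy-based informativeness criterion are our own illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: FedDF-style server-side distillation using only
# crops of a single shared image. Names and the pruning criterion are
# illustrative assumptions, not the paper's actual code.
import torch
import torch.nn.functional as F

def make_crops(image: torch.Tensor, num_crops: int, size: int = 32) -> torch.Tensor:
    """Generate random crops from one shared image of shape (C, H, W)."""
    _, h, w = image.shape
    crops = []
    for _ in range(num_crops):
        top = torch.randint(0, h - size + 1, (1,)).item()
        left = torch.randint(0, w - size + 1, (1,)).item()
        crops.append(image[:, top:top + size, left:left + size])
    return torch.stack(crops)  # (num_crops, C, size, size)

def prune_crops(crops, client_models, budget):
    """Adaptive pruning (assumed criterion): keep the crops on which the
    client ensemble's averaged prediction is most uncertain."""
    with torch.no_grad():
        probs = torch.stack([F.softmax(m(crops), dim=1) for m in client_models])
        mean = probs.mean(dim=0)  # ensemble prediction per crop
        entropy = -(mean * mean.clamp_min(1e-8).log()).sum(dim=1)
        keep = entropy.topk(budget).indices
    return crops[keep]

def distill_round(server_model, client_models, shared_image, budget=256,
                  steps=100, lr=1e-3):
    """One server-side distillation round under a shared-dataset budget."""
    crops = prune_crops(make_crops(shared_image, 4 * budget), client_models, budget)
    with torch.no_grad():  # averaged client predictions act as the teacher
        teacher = torch.stack([F.softmax(m(crops), dim=1)
                               for m in client_models]).mean(dim=0)
    opt = torch.optim.Adam(server_model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = F.kl_div(F.log_softmax(server_model(crops), dim=1),
                        teacher, reduction="batchmean")
        loss.backward()
        opt.step()
    return server_model
```

The point the sketch tries to capture is that a single image can supply an arbitrarily large pool of crops, from which only the most informative ones need to be kept under the shared-dataset budget.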
Related papers
- Personalized Federated Learning via Sequential Layer Expansion in Representation Learning [0.0]
Federated learning ensures the privacy of clients by conducting distributed training on individual client devices and sharing only the model weights with a central server.
We propose a new representation-learning-based approach that decouples the entire deep learning model into more finely divided parts and applies suitable scheduling methods.
arXiv Detail & Related papers (2024-04-27T06:37:19Z) - Federated Learning with Only Positive Labels by Exploring Label Correlations [78.59613150221597]
Federated learning aims to collaboratively learn a model by using the data from multiple users under privacy constraints.
In this paper, we study the multi-label classification problem under the federated learning setting.
We propose a novel and generic method termed Federated Averaging by exploring Label Correlations (FedALC).
arXiv Detail & Related papers (2024-04-24T02:22:50Z) - Heterogeneous Federated Learning Using Knowledge Codistillation [23.895665011884102]
We propose a method that involves training a small model on the entire pool and a larger model on a subset of clients with higher capacity.
The models exchange information bidirectionally via knowledge distillation, utilizing an unlabeled dataset on a server without sharing parameters.
arXiv Detail & Related papers (2023-10-04T03:17:26Z) - FedHP: Heterogeneous Federated Learning with Privacy-preserving [0.0]
Federated learning is a distributed machine learning paradigm in which clients complete collaborative training by exchanging only parameters, never sharing private data.
We propose a novel federated learning method that uses a pre-trained model as the backbone and fully connected layers as the head.
By sharing class embedding vectors instead of gradient-space parameters, clients can better adapt to their private data, and communication between the server and clients becomes more efficient.
arXiv Detail & Related papers (2023-01-27T13:32:17Z) - Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on device).
In FL, each data holder trains a model locally and releases it to a central server for aggregation.
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and backpropagation).
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
arXiv Detail & Related papers (2022-11-20T10:49:22Z) - Cross-domain Federated Object Detection [43.66352018668227]
Federated learning can enable multi-party collaborative learning without leaking client data.
We propose a cross-domain federated object detection framework, named FedOD.
arXiv Detail & Related papers (2022-06-30T03:09:59Z) - An Expectation-Maximization Perspective on Federated Learning [75.67515842938299]
Federated learning describes the distributed training of models across multiple clients while keeping the data private on-device.
In this work, we view the server-orchestrated federated learning process as a hierarchical latent variable model where the server provides the parameters of a prior distribution over the client-specific model parameters.
We show that with simple Gaussian priors and a hard version of the well-known Expectation-Maximization (EM) algorithm, learning in such a model corresponds to FedAvg, the most popular algorithm for the federated learning setting.
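Roughly, the stated correspondence can be written out as below; the notation is ours and this is only a sketch of the idea, not the paper's full derivation.

```latex
% Hard-EM view with Gaussian priors (notation ours): client k holds data
% D_k and local parameters phi_k; the server maintains the prior mean theta.
\begin{align*}
  \phi_k^{(t)} &\leftarrow \arg\max_{\phi}\;
      \log p(\mathcal{D}_k \mid \phi)
      + \log \mathcal{N}\!\left(\phi \mid \theta^{(t)}, \sigma^2 I\right)
      && \text{(E-step: local training)} \\
  \theta^{(t+1)} &\leftarrow \frac{1}{K} \sum_{k=1}^{K} \phi_k^{(t)}
      && \text{(M-step: server averaging, as in FedAvg)}
\end{align*}
```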
arXiv Detail & Related papers (2021-11-19T12:58:59Z) - Federated Multi-Target Domain Adaptation [99.93375364579484]
Federated learning methods enable us to train machine learning models on distributed user data while preserving its privacy.
We consider a more practical scenario where the distributed client data is unlabeled, and a centralized labeled dataset is available on the server.
We propose an effective DualAdapt method to address these new challenges.
arXiv Detail & Related papers (2021-08-17T17:53:05Z) - Exploiting Shared Representations for Personalized Federated Learning [54.65133770989836]
We propose a novel federated learning framework and algorithm for learning a shared data representation across clients and unique local heads for each client.
Our algorithm harnesses the distributed computational power across clients to perform many local updates with respect to the low-dimensional local parameters for every update of the representation.
This result is of interest beyond federated learning to a broad class of problems in which we aim to learn a shared low-dimensional representation among data distributions.
arXiv Detail & Related papers (2021-02-14T05:36:25Z) - Federated Residual Learning [53.77128418049985]
We study a new form of federated learning where the clients train personalized local models and make predictions jointly with the server-side shared model (a minimal sketch of this joint prediction appears just below).
Using this new federated learning framework, the complexity of the central shared model can be minimized while still gaining all the performance benefits that joint training provides.
arXiv Detail & Related papers (2020-03-28T19:55:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and accepts no responsibility for any consequences of its use.