FedProto: Federated Prototype Learning over Heterogeneous Devices
- URL: http://arxiv.org/abs/2105.00243v1
- Date: Sat, 1 May 2021 13:21:56 GMT
- Title: FedProto: Federated Prototype Learning over Heterogeneous Devices
- Authors: Yue Tan, Guodong Long, Lu Liu, Tianyi Zhou and Jing Jiang
- Abstract summary: We propose a novel federated prototype learning (FedProto) framework in which the devices and server communicate the class prototypes instead of the gradients.
FedProto aggregates the local prototypes collected from different devices, and then sends the global prototypes back to all devices to regularize the training of local models.
The training on each device aims to minimize the classification error on the local data while keeping the resulting local prototypes sufficiently close to the corresponding global ones.
- Score: 40.10333186507569
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The heterogeneity across devices usually hinders the optimization convergence
and generalization performance of federated learning (FL) when the aggregation
of devices' knowledge occurs in the gradient space. For example, devices may
differ in terms of data distribution, network latency, input/output space,
and/or model architecture, which can easily lead to the misalignment of their
local gradients. To improve the tolerance to heterogeneity, we propose a novel
federated prototype learning (FedProto) framework in which the devices and
server communicate the class prototypes instead of the gradients. FedProto
aggregates the local prototypes collected from different devices, and then
sends the global prototypes back to all devices to regularize the training of
local models. The training on each device aims to minimize the classification
error on the local data while keeping the resulting local prototypes
sufficiently close to the corresponding global ones. For evaluation, we
propose a benchmark setting tailored for heterogeneous FL, on which FedProto
outperforms several recent FL approaches across multiple datasets.
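The scheme described above is compact enough to sketch end to end. Below is a minimal, illustrative NumPy sketch of the three pieces the abstract names: per-class local prototypes, server-side averaging into global prototypes, and a local objective that adds a prototype-alignment term to the classification loss. The helper names and the regularization weight `lam` are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def local_prototypes(embeddings, labels):
    """Per-class mean embedding computed on one device (its local prototypes)."""
    return {c: embeddings[labels == c].mean(axis=0) for c in np.unique(labels)}

def aggregate_prototypes(all_local_protos):
    """Server step: average each class's local prototypes into a global prototype."""
    pooled = {}
    for protos in all_local_protos:
        for c, p in protos.items():
            pooled.setdefault(c, []).append(p)
    return {c: np.mean(ps, axis=0) for c, ps in pooled.items()}

def local_objective(ce_loss, local_protos, global_protos, lam=1.0):
    """Device objective: classification error plus a term that keeps the
    local prototypes close to the corresponding global ones."""
    align = sum(np.sum((local_protos[c] - global_protos[c]) ** 2)
                for c in local_protos if c in global_protos)
    return ce_loss + lam * align
```

In one round, each device would train against this objective, recompute its prototypes, and send those (rather than gradients) to the server, which re-aggregates and broadcasts the global prototypes.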
Related papers
- FedAli: Personalized Federated Learning with Aligned Prototypes through Optimal Transport [9.683642138601464]
Federated Learning (FL) enables collaborative, personalized model training across multiple devices without sharing raw data.
We introduce Alignment with Prototypes layers, which align incoming embeddings with learnable prototypes.
We evaluate FedAli on heterogeneous sensor-based human activity recognition and vision benchmark datasets, demonstrating that it outperforms existing FL strategies.
arXiv Detail & Related papers (2024-11-15T21:35:21Z)
- Stragglers-Aware Low-Latency Synchronous Federated Learning via Layer-Wise Model Updates [71.81037644563217]
Synchronous federated learning (FL) is a popular paradigm for collaborative edge learning.
As some of the devices may have limited computational resources and varying availability, FL latency is highly sensitive to stragglers.
We propose straggler-aware layer-wise federated learning (SALF) that leverages the optimization procedure of NNs via backpropagation to update the global model in a layer-wise fashion.
arXiv Detail & Related papers (2024-03-27T09:14:36Z)
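One plausible reading of the layer-wise scheme above: since backpropagation produces gradients from the output layer backward, a straggler cut off mid-round can still report updates for its deepest layers, and the server averages each layer over whichever devices reached it. The sketch below illustrates that reading; it is an assumption drawn from the abstract, not the paper's exact update rule.

```python
import numpy as np

def layerwise_aggregate(partial_updates, num_layers):
    """Average per-layer updates over the devices that computed them.

    partial_updates: one dict per device, {layer_index: update array};
    stragglers may hold entries only for the last (deepest) layers.
    """
    aggregated = []
    for layer in range(num_layers):
        contribs = [u[layer] for u in partial_updates if layer in u]
        # A layer that no device reached this round is left unchanged (None).
        aggregated.append(np.mean(contribs, axis=0) if contribs else None)
    return aggregated
```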
- Embedding Alignment for Unsupervised Federated Learning via Smart Data Exchange [21.789359767103154]
Federated learning (FL) has been recognized as one of the most promising solutions for distributed machine learning (ML).
We develop a novel methodology, Cooperative Federated unsupervised Contrastive Learning (CF-CL), for FL across edge devices with unlabeled datasets.
arXiv Detail & Related papers (2022-08-04T19:26:59Z)
- FedHiSyn: A Hierarchical Synchronous Federated Learning Framework for Resource and Data Heterogeneity [56.82825745165945]
Federated Learning (FL) enables training a global model without sharing the decentralized raw data stored on multiple devices to protect data privacy.
We propose a hierarchical synchronous FL framework, FedHiSyn, to tackle straggler effects and outdated models.
We evaluate the proposed framework on the MNIST, EMNIST, CIFAR10 and CIFAR100 datasets under diverse heterogeneous device settings.
arXiv Detail & Related papers (2022-06-21T17:23:06Z)
- FedCAT: Towards Accurate Federated Learning via Device Concatenation [4.416919766772866]
Federated Learning (FL) enables all the involved devices to train a global model collaboratively without exposing their private local data.
For non-IID scenarios, the classification accuracy of FL models decreases drastically due to the weight divergence caused by data heterogeneity.
We introduce a novel FL approach named FedCAT that can achieve high model accuracy based on our proposed device selection strategy and device concatenation-based local training method.
arXiv Detail & Related papers (2022-02-23T10:08:43Z)
- Gradual Federated Learning with Simulated Annealing [26.956032164461377]
Federated averaging (FedAvg) is a popular federated learning (FL) technique that updates the global model by averaging local models.
In this paper, we propose a new FL technique, SAFL, based on simulated annealing.
We show that SAFL outperforms the conventional FedAvg technique in terms of the convergence speed and the classification accuracy.
arXiv Detail & Related papers (2021-10-11T11:57:56Z)
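For context, the FedAvg rule named in the entry above is a sample-weighted average of the locally trained parameters. A minimal sketch, assuming each device reports its per-layer parameters as NumPy arrays along with its local sample count:

```python
import numpy as np

def fedavg(local_params, num_samples):
    """FedAvg aggregation: average local models, weighted by local data size.

    local_params: per-device lists of layer parameter arrays.
    num_samples: per-device training-set sizes (the averaging weights).
    """
    w = np.asarray(num_samples, dtype=float)
    w /= w.sum()
    return [sum(wi * layer for wi, layer in zip(w, layers))
            for layers in zip(*local_params)]
```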
- Rethinking Architecture Design for Tackling Data Heterogeneity in Federated Learning [53.73083199055093]
We show that attention-based architectures (e.g., Transformers) are fairly robust to distribution shifts.
Our experiments show that replacing convolutional networks with Transformers can greatly reduce catastrophic forgetting of previous devices.
arXiv Detail & Related papers (2021-06-10T21:04:18Z)
- Fast-Convergent Federated Learning [82.32029953209542]
Federated learning is a promising solution for distributing machine learning tasks through modern networks of mobile devices.
We propose a fast-convergent federated learning algorithm, called FOLB, which performs intelligent sampling of devices in each round of model training.
arXiv Detail & Related papers (2020-07-26T14:37:51Z)
- Think Locally, Act Globally: Federated Learning with Local and Global Representations [92.68484710504666]
Federated learning is a method of training models on private data distributed over multiple devices.
We propose a new federated learning algorithm that jointly learns compact local representations on each device and a global model across all devices.
We also evaluate on the task of personalized mood prediction from real-world mobile data where privacy is key.
arXiv Detail & Related papers (2020-01-06T12:40:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.