On the Impact of Device and Behavioral Heterogeneity in Federated
Learning
- URL: http://arxiv.org/abs/2102.07500v1
- Date: Mon, 15 Feb 2021 12:04:38 GMT
- Title: On the Impact of Device and Behavioral Heterogeneity in Federated
Learning
- Authors: Ahmed M. Abdelmoniem and Chen-Yu Ho and Pantelis Papageorgiou and
Muhammad Bilal and Marco Canini
- Abstract summary: Federated learning (FL) is becoming a popular paradigm for collaborative learning over distributed, private datasets owned by non-trusting entities.
This paper describes the challenge of performing training over largely heterogeneous datasets, devices, and networks.
We conduct an empirical study spanning close to 1.5K unique configurations on five popular FL benchmarks.
- Score: 5.038980064083677
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) is becoming a popular paradigm for collaborative
learning over distributed, private datasets owned by non-trusting entities. FL
has seen successful deployment in production environments, and it has been
adopted in services such as virtual keyboards, auto-completion, item
recommendation, and several IoT applications. However, FL comes with the
challenge of performing training over largely heterogeneous datasets, devices,
and networks that are out of the control of the centralized FL server.
Motivated by this inherent setting, we make a first step towards characterizing
the impact of device and behavioral heterogeneity on the trained model. We
conduct an extensive empirical study spanning close to 1.5K unique
configurations on five popular FL benchmarks. Our analysis shows that these
sources of heterogeneity have a major impact on both model performance and
fairness, thus shedding light on the importance of considering heterogeneity in FL
system design.
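For context on the aggregation step that device and behavioral heterogeneity perturbs, a FedAvg-style weighted average can be sketched as follows (a minimal illustration with toy parameter vectors and hypothetical names, not the paper's code; real systems aggregate full model tensors):

```python
def fedavg_aggregate(client_params, client_sizes):
    """FedAvg-style aggregation: average client parameter vectors,
    weighting each client by its number of local training samples."""
    total = sum(client_sizes)
    num_params = len(client_params[0])
    return [
        sum(p[j] * n for p, n in zip(client_params, client_sizes)) / total
        for j in range(num_params)
    ]

# Three clients with toy 2-parameter models and unequal data sizes.
params = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
sizes = [10, 30, 60]

# Full participation: every client contributes proportionally.
full_round = fedavg_aggregate(params, sizes)

# Behavioral heterogeneity: the largest client drops out mid-round,
# so the aggregate shifts toward the surviving clients' data.
partial_round = fedavg_aggregate(params[:2], sizes[:2])
```

Comparing `full_round` and `partial_round` shows how dropouts and stragglers bias the global model toward the data of clients that finish on time, which is the kind of effect the empirical study measures.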
Related papers
- Can We Theoretically Quantify the Impacts of Local Updates on the Generalization Performance of Federated Learning? [50.03434441234569]
Federated Learning (FL) has gained significant popularity due to its effectiveness in training machine learning models across diverse sites without requiring direct data sharing.
While various algorithms have shown that FL with local updates is a communication-efficient distributed learning framework, the generalization performance of FL with local updates has received comparatively less attention.
arXiv Detail & Related papers (2024-09-05T19:00:18Z)
- Fake It Till Make It: Federated Learning with Consensus-Oriented Generation [52.82176415223988]
We propose federated learning with consensus-oriented generation (FedCOG).
FedCOG consists of two key components at the client side: complementary data generation and knowledge-distillation-based model training.
Experiments on classical and real-world FL datasets show that FedCOG consistently outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-12-10T18:49:59Z)
- Handling Data Heterogeneity via Architectural Design for Federated Visual Recognition [16.50490537786593]
We study 19 visual recognition models from five different architectural families on four challenging FL datasets.
Our findings emphasize the importance of architectural design for computer vision tasks in practical scenarios.
arXiv Detail & Related papers (2023-10-23T17:59:16Z)
- FedConv: Enhancing Convolutional Neural Networks for Handling Data Heterogeneity in Federated Learning [34.37155882617201]
Federated learning (FL) is an emerging paradigm in machine learning, where a shared model is collaboratively learned using data from multiple devices.
We systematically investigate the impact of different architectural elements, such as activation functions and normalization layers, on the performance within heterogeneous FL.
Our findings indicate that with strategic architectural modifications, pure CNNs can achieve a level of robustness that either matches or even exceeds that of ViTs.
arXiv Detail & Related papers (2023-10-06T17:57:50Z)
- PFL-GAN: When Client Heterogeneity Meets Generative Models in Personalized Federated Learning [55.930403371398114]
We propose a novel generative adversarial network (GAN) sharing and aggregation strategy for personalized federated learning (PFL).
PFL-GAN addresses client heterogeneity in different scenarios. More specifically, we first learn the similarity among clients and then develop a weighted collaborative data aggregation scheme.
Empirical results from rigorous experimentation on several well-known datasets demonstrate the effectiveness of PFL-GAN.
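The "learn similarity, then aggregate with weights" idea can be illustrated with a small sketch. This is a schematic interpretation using cosine similarity over per-client representation vectors, not PFL-GAN's actual GAN-based procedure; all names are illustrative:

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def personalized_aggregate(client_models, client_reprs, target):
    """Build a personalized model for client `target` as a weighted
    combination of all client models, weighting each contributor by
    its (clipped non-negative) representation similarity to the target."""
    sims = [max(cosine_similarity(client_reprs[target], r), 0.0)
            for r in client_reprs]
    total = sum(sims)
    coeffs = [s / total for s in sims]
    num_params = len(client_models[0])
    return [sum(c * m[j] for c, m in zip(coeffs, client_models))
            for j in range(num_params)]

models = [[1.0, 1.0], [3.0, 3.0]]
# Identical representations: equal weights, plain average.
avg_model = personalized_aggregate(models, [[1.0, 0.0], [1.0, 0.0]], 0)
# Orthogonal representations: the target keeps only its own model.
own_model = personalized_aggregate(models, [[1.0, 0.0], [0.0, 1.0]], 0)
```

Under this scheme, dissimilar clients contribute little to each other's personalized models, while similar clients pool their updates.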
arXiv Detail & Related papers (2023-08-23T22:38:35Z)
- An Empirical Study of Federated Learning on IoT-Edge Devices: Resource Allocation and Heterogeneity [2.055204980188575]
Federated Learning (FL) is a distributed approach in which a single server and multiple clients collaboratively build an ML model without moving data away from clients.
In this study, we systematically conduct extensive experiments on a large network of IoT and edge devices (called IoT-Edge devices) to present FL real-world characteristics.
arXiv Detail & Related papers (2023-05-31T13:16:07Z)
- Deep Equilibrium Models Meet Federated Learning [71.57324258813675]
This study explores the problem of Federated Learning (FL) by utilizing the Deep Equilibrium (DEQ) models instead of conventional deep learning networks.
We claim that incorporating DEQ models into the federated learning framework naturally addresses several open problems in FL.
To the best of our knowledge, this study is the first to establish a connection between DEQ models and federated learning.
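As rough intuition for what a DEQ model computes, its forward pass solves for an equilibrium z* = f(z*, x) instead of unrolling a fixed stack of layers. A scalar toy version using naive fixed-point iteration (illustrative only, not the paper's formulation) is:

```python
def deq_forward(f, x, z0=0.0, tol=1e-9, max_iter=10_000):
    """Naive fixed-point iteration: repeatedly apply f until the
    equilibrium z = f(z, x) stops changing. Real DEQ implementations
    use faster root-finders (e.g. Anderson acceleration or Broyden)."""
    z = z0
    for _ in range(max_iter):
        z_next = f(z, x)
        if abs(z_next - z) < tol:
            return z_next
        z = z_next
    return z

# A contractive map with a closed-form equilibrium: z = 0.5*z + x
# converges to z* = 2*x, so with x = 1.0 the result approaches 2.0.
equilibrium = deq_forward(lambda z, x: 0.5 * z + x, 1.0)
```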
arXiv Detail & Related papers (2023-05-29T22:51:40Z)
- FS-Real: Towards Real-World Cross-Device Federated Learning [60.91678132132229]
Federated Learning (FL) aims to train high-quality models in collaboration with distributed clients while not uploading their local data.
There is still a considerable gap between flourishing FL research and real-world scenarios, mainly caused by the characteristics and scale of heterogeneous devices.
We propose an efficient and scalable prototyping system for real-world cross-device FL, FS-Real.
arXiv Detail & Related papers (2023-03-23T15:37:17Z)
- Efficient Split-Mix Federated Learning for On-Demand and In-Situ Customization [107.72786199113183]
Federated learning (FL) provides a distributed learning framework for multiple participants to collaborate learning without sharing raw data.
In this paper, we propose a novel Split-Mix FL strategy for heterogeneous participants that, once training is done, provides in-situ customization of model sizes and robustness.
arXiv Detail & Related papers (2022-03-18T04:58:34Z)
- Characterizing Impacts of Heterogeneity in Federated Learning upon Large-Scale Smartphone Data [23.67491703843822]
Federated learning (FL) is an emerging, privacy-preserving machine learning paradigm, drawing tremendous attention in academia and industry.
A unique characteristic of FL is heterogeneity, which resides in the various hardware specifications and dynamic states across the participating devices.
We conduct extensive experiments to compare the performance of state-of-the-art FL algorithms under heterogeneity-aware and heterogeneity-unaware settings.
arXiv Detail & Related papers (2020-06-12T07:49:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.