Distribution-Aware Mobility-Assisted Decentralized Federated Learning
- URL: http://arxiv.org/abs/2505.18866v1
- Date: Sat, 24 May 2025 20:47:42 GMT
- Title: Distribution-Aware Mobility-Assisted Decentralized Federated Learning
- Authors: Md Farhamdur Reza, Reza Jahani, Richeng Jin, Huaiyu Dai,
- Abstract summary: Decentralized federated learning (DFL) has attracted significant attention due to its scalability and independence from a central server. We show that introducing a small fraction of mobile clients, even with random movement, can significantly improve the accuracy of DFL by facilitating information flow. We propose novel distribution-aware mobility patterns, where mobile clients strategically navigate the network.
- Score: 28.986426579981924
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Decentralized federated learning (DFL) has attracted significant attention due to its scalability and independence from a central server. In practice, some participating clients can be mobile, yet the impact of user mobility on DFL performance remains largely unexplored, despite its potential to facilitate communication and model convergence. In this work, we demonstrate that introducing a small fraction of mobile clients, even with random movement, can significantly improve the accuracy of DFL by facilitating information flow. To further enhance performance, we propose novel distribution-aware mobility patterns, where mobile clients strategically navigate the network, leveraging knowledge of data distributions and static client locations. The proposed moving strategies mitigate the impact of data heterogeneity and boost learning convergence. Extensive experiments validate the effectiveness of induced mobility in DFL and demonstrate the superiority of our proposed mobility patterns over random movement.
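The mobility effect described in the abstract can be illustrated with a toy simulation (our own sketch, not the paper's algorithm): static clients hold scalar "models" and gossip-average with neighbors in communication range, while a few mobile clients relocate each round and carry information between otherwise poorly connected regions.

```python
# Toy DFL sketch (illustrative only): gossip averaging plus random mobility.
# All names, positions, and parameters here are our own assumptions.
import random

random.seed(0)

N_STATIC, N_MOBILE, RADIUS, ROUNDS = 20, 3, 0.25, 50

# Random positions in the unit square; each client starts with a local "model".
pos = [(random.random(), random.random()) for _ in range(N_STATIC + N_MOBILE)]
model = [random.random() for _ in range(N_STATIC + N_MOBILE)]

def neighbors(i):
    """Indices of clients within communication range of client i."""
    xi, yi = pos[i]
    return [j for j, (xj, yj) in enumerate(pos)
            if j != i and (xi - xj) ** 2 + (yi - yj) ** 2 <= RADIUS ** 2]

for _ in range(ROUNDS):
    # Gossip step: every client averages its model with in-range neighbors.
    new = []
    for i in range(len(model)):
        nb = neighbors(i)
        new.append(sum(model[j] for j in [i] + nb) / (1 + len(nb)))
    model = new
    # Mobility step: the last N_MOBILE clients jump to random locations,
    # bridging regions that static-only gossip would mix slowly or not at all.
    for i in range(N_STATIC, N_STATIC + N_MOBILE):
        pos[i] = (random.random(), random.random())

# Spread of models measures consensus; mobility accelerates its decay.
spread = max(model) - min(model)
print(f"model spread after {ROUNDS} rounds: {spread:.4f}")
```

Averaging keeps every model inside the convex hull of the initial values, so the spread can only shrink; the mobile clients' role is to make the union of the per-round communication graphs connected so that shrinkage actually reaches all clients.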
Related papers
- MoCFL: Mobile Cluster Federated Learning Framework for Highly Dynamic Network [10.962599830266676]
Frequent fluctuations of client nodes in highly dynamic mobile clusters can lead to significant changes in feature space distribution and data drift. We propose a mobile cluster federated learning framework (MoCFL) to address these issues. MoCFL enhances feature aggregation by introducing an affinity matrix that quantifies the similarity between local feature extractors from different clients.
arXiv Detail & Related papers (2025-03-03T13:59:47Z) - Federated Learning in Mobile Networks: A Comprehensive Case Study on Traffic Forecasting [2.661771915992631]
Federated learning (FL) is a distributed and privacy-preserving solution to foster collaboration among different sites. In this paper, we study the potential benefits of FL in telecommunications through a case study on federated traffic forecasting using real-world data from base stations (BSs) in Barcelona (Spain). The performed evaluation is based on both prediction accuracy and sustainability, thus showcasing the environmental impact of employed FL algorithms in various settings.
arXiv Detail & Related papers (2024-12-05T11:32:14Z) - FoMo: A Foundation Model for Mobile Traffic Forecasting with Diffusion Model [5.96737388771505]
We propose an innovative Foundation model for Mobile traffic forecasting (FoMo). FoMo aims to handle diverse forecasting tasks of short/long-term predictions and distribution generation across multiple cities to support network planning and optimization. FoMo combines diffusion models and transformers, where various universality masks are proposed to enable FoMo to learn intrinsic features of different tasks.
arXiv Detail & Related papers (2024-10-20T07:32:16Z) - Data-Heterogeneous Hierarchical Federated Learning with Mobility [20.482704508355905]
Federated learning enables distributed training of machine learning (ML) models across multiple devices.
We consider a data-heterogeneous HFL scenario with mobility, mainly targeting vehicular networks.
We show that mobility can indeed improve the model accuracy by up to 15.1% when training a convolutional neural network.
arXiv Detail & Related papers (2023-06-19T04:22:18Z) - PS-FedGAN: An Efficient Federated Learning Framework Based on Partially Shared Generative Adversarial Networks For Data Privacy [56.347786940414935]
Federated Learning (FL) has emerged as an effective learning paradigm for distributed computation.
This work proposes a novel FL framework that requires only partial GAN model sharing.
Named as PS-FedGAN, this new framework enhances the GAN releasing and training mechanism to address heterogeneous data distributions.
arXiv Detail & Related papers (2023-05-19T05:39:40Z) - Improving Privacy-Preserving Vertical Federated Learning by Efficient Communication with ADMM [62.62684911017472]
Federated learning (FL) enables devices to jointly train shared models while keeping the training data local for privacy purposes.
We introduce a VFL framework with multiple heads (VIM), which takes the separate contribution of each client into account.
VIM achieves significantly higher performance and faster convergence compared with the state-of-the-art.
arXiv Detail & Related papers (2022-07-20T23:14:33Z) - Mobility-Aware Cluster Federated Learning in Hierarchical Wireless Networks [81.83990083088345]
We develop a theoretical model to characterize the hierarchical federated learning (HFL) algorithm in wireless networks.
Our analysis proves that the learning performance of HFL deteriorates drastically with highly-mobile users.
To circumvent these issues, we propose a mobility-aware cluster federated learning (MACFL) algorithm.
arXiv Detail & Related papers (2021-08-20T10:46:58Z) - Federated Robustness Propagation: Sharing Adversarial Robustness in Federated Learning [98.05061014090913]
Federated learning (FL) emerges as a popular distributed learning schema that learns from a set of participating users without requiring raw data to be shared.
While adversarial training (AT) provides a sound solution for centralized learning, extending its usage to FL users imposes significant challenges.
We show that existing FL techniques cannot effectively propagate adversarial robustness among non-iid users.
We propose a simple yet effective propagation approach that transfers robustness through carefully designed batch-normalization statistics.
arXiv Detail & Related papers (2021-06-18T15:52:33Z) - To Talk or to Work: Flexible Communication Compression for Energy Efficient Federated Learning over Heterogeneous Mobile Edge Devices [78.38046945665538]
Federated learning (FL) over massive mobile edge devices opens new horizons for numerous intelligent mobile applications.
FL imposes huge communication and computation burdens on participating devices due to periodical global synchronization and continuous local training.
We develop a convergence-guaranteed FL algorithm enabling flexible communication compression.
arXiv Detail & Related papers (2020-12-22T02:54:18Z) - Communication Efficient Federated Learning with Energy Awareness over Wireless Networks [51.645564534597625]
In federated learning (FL), the parameter server and the mobile devices share the training parameters over wireless links.
We adopt the idea of SignSGD in which only the signs of the gradients are exchanged.
Two optimization problems are formulated and solved to optimize the learning performance.
Considering that the data may be distributed across the mobile devices in a highly uneven fashion in FL, a sign-based algorithm is proposed.
arXiv Detail & Related papers (2020-04-15T21:25:13Z)
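The SignSGD idea from the last entry above, where devices exchange only the signs of their gradients, can be sketched as follows. The quadratic toy objective, targets, and learning rate are our own illustrative choices, not taken from the paper:

```python
# Minimal SignSGD-with-majority-vote sketch (illustrative assumptions only):
# each device sends one sign per parameter instead of a full-precision gradient.

def sign(x):
    """Return -1, 0, or +1; this one-bit value is all a device transmits."""
    return 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)

# Each "device" holds a local target t and local loss (w - t)^2.
targets = [0.2, 0.4, 0.9, 1.5]
w, lr = 0.0, 0.02

for step in range(500):
    # Devices compute the sign of their local gradient d/dw (w - t)^2 = 2(w - t).
    votes = [sign(2 * (w - t)) for t in targets]
    # The server aggregates by majority vote and broadcasts a sign-only update.
    w -= lr * sign(sum(votes))

# Majority voting drives w toward the median region of the targets,
# i.e. between the two middle values 0.4 and 0.9 in this toy example.
print(f"w ~= {w:.2f}")
```

A useful property visible even in this toy: majority-vote sign aggregation converges toward the median rather than the mean of the devices' optima, which is exactly why uneven (non-IID) data distributions across devices, as noted in the entry above, call for modified sign-based algorithms.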
This list is automatically generated from the titles and abstracts of the papers on this site.