Spatio-Temporal Federated Learning for Massive Wireless Edge Networks
- URL: http://arxiv.org/abs/2110.14578v1
- Date: Wed, 27 Oct 2021 16:46:45 GMT
- Title: Spatio-Temporal Federated Learning for Massive Wireless Edge Networks
- Authors: Chun-Hung Liu, Kai-Ten Feng, Lu Wei, Yu Luo
- Abstract summary: An edge server and numerous mobile devices (clients) jointly learn a global model without transporting the huge amounts of data collected by the mobile devices to the edge server.
The proposed FL approach exploits the spatial and temporal correlations between the learning updates from different mobile devices scheduled to join STFL in various training epochs.
An analytical framework of STFL is proposed and employed to study the learning capability of STFL via its convergence performance.
- Score: 23.389249751372393
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a novel approach for conducting highly efficient federated
learning (FL) over a massive wireless edge network, where an edge server and
numerous mobile devices (clients) jointly learn a global model without
transporting the huge amount of data collected by the mobile devices to the
edge server. The proposed FL approach is referred to as spatio-temporal FL
(STFL), which jointly exploits the spatial and temporal correlations between
the learning updates from different mobile devices scheduled to join STFL in
various training epochs. The STFL model not only represents the realistic
intermittent learning behavior from the edge server to the mobile devices due
to data delivery outage, but also features a mechanism for compensating lost
learning updates in order to mitigate the impact of intermittent learning. An
analytical framework of STFL is proposed and employed to study the learning
capability of STFL via its convergence performance. In particular, we have
assessed the impact of data delivery outage, intermittent learning mitigation,
and statistical heterogeneity of datasets on the convergence performance of
STFL. The results provide crucial insights into the design and analysis of
STFL-based wireless networks.
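The abstract describes the compensation mechanism only at a high level. The sketch below illustrates one plausible reading of it in a federated-averaging step: when a client's update is lost to a data delivery outage, the server builds a surrogate from that client's most recent delivered update (temporal correlation) and the updates received from other clients in the current epoch (spatial correlation). The helper name, the mixing rule, and the coefficients `alpha`/`beta` are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def aggregate_with_compensation(received, last_seen, alpha=0.5, beta=0.5):
    """Sketch of spatio-temporal compensation in FL aggregation (assumed form).

    received:  dict client_id -> update vector delivered in this epoch.
    last_seen: dict client_id -> most recent update ever delivered (may be stale).
    Returns the average update over all clients, with lost updates compensated.
    """
    all_clients = last_seen.keys()
    # Spatial information: mean of the updates that actually arrived this epoch.
    spatial_mean = np.mean(list(received.values()), axis=0) if received else None
    updates = []
    for cid in all_clients:
        if cid in received:                 # update delivered: use it directly
            updates.append(received[cid])
        elif spatial_mean is not None:      # outage: mix stale (temporal) and spatial info
            updates.append(alpha * last_seen[cid] + beta * spatial_mean)
        else:                               # nothing received this epoch at all
            updates.append(last_seen[cid])
    return np.mean(updates, axis=0)

# Example: client 2's update is lost this epoch and is reconstructed from its
# stale update and the spatial mean of the updates from clients 0 and 1.
last = {0: np.array([1.0, 0.0]), 1: np.array([0.5, 0.5]), 2: np.array([0.0, 1.0])}
recv = {0: np.array([1.2, 0.1]), 1: np.array([0.6, 0.4])}
print(aggregate_with_compensation(recv, last))
```

In this reading, setting `beta = 0` reduces the scheme to reusing stale updates, while `alpha = 0` relies purely on spatial correlation across the delivered updates; the paper's analytical framework studies how such compensation interacts with outage probability and data heterogeneity.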
Related papers
- Stragglers-Aware Low-Latency Synchronous Federated Learning via Layer-Wise Model Updates [71.81037644563217]
Synchronous federated learning (FL) is a popular paradigm for collaborative edge learning.
As some of the devices may have limited computational resources and varying availability, FL latency is highly sensitive to stragglers.
We propose straggler-aware layer-wise federated learning (SALF) that leverages the optimization procedure of NNs via backpropagation to update the global model in a layer-wise fashion.
arXiv Detail & Related papers (2024-03-27T09:14:36Z)
- Edge-assisted U-Shaped Split Federated Learning with Privacy-preserving for Internet of Things [4.68267059122563]
We present an innovative Edge-assisted U-Shaped Split Federated Learning (EUSFL) framework, which harnesses the high-performance capabilities of edge servers.
In this framework, we leverage Federated Learning (FL) to enable data holders to collaboratively train models without sharing their data.
We also propose a novel noise mechanism called LabelDP to ensure that data features and labels can securely resist reconstruction attacks.
arXiv Detail & Related papers (2023-11-08T05:14:41Z)
- Adaptive Model Pruning and Personalization for Federated Learning over Wireless Networks [72.59891661768177]
Federated learning (FL) enables distributed learning across edge devices while protecting data privacy.
We consider an FL framework with partial model pruning and personalization to overcome these challenges.
This framework splits the learning model into a global part with model pruning, shared with all devices to learn data representations, and a personalized part to be fine-tuned for a specific device.
arXiv Detail & Related papers (2023-09-04T21:10:45Z)
- Analysis and Optimization of Wireless Federated Learning with Data Heterogeneity [72.85248553787538]
This paper focuses on performance analysis and optimization for wireless FL, considering data heterogeneity, combined with wireless resource allocation.
We formulate the loss function minimization problem, under constraints on long-term energy consumption and latency, and jointly optimize client scheduling, resource allocation, and the number of local training epochs (CRE).
Experiments on real-world datasets demonstrate that the proposed algorithm outperforms other benchmarks in terms of the learning accuracy and energy consumption.
arXiv Detail & Related papers (2023-08-04T04:18:01Z)
- Time-sensitive Learning for Heterogeneous Federated Edge Intelligence [52.83633954857744]
We investigate real-time machine learning in a federated edge intelligence (FEI) system.
FEI systems exhibit heterogeneous communication and computational resource distributions.
We propose a time-sensitive federated learning (TS-FL) framework to minimize the overall run-time for collaboratively training a shared ML model.
arXiv Detail & Related papers (2023-01-26T08:13:22Z)
- Online Data Selection for Federated Learning with Limited Storage [53.46789303416799]
Federated Learning (FL) has been proposed to achieve distributed machine learning among networked devices.
The impact of on-device storage on the performance of FL remains largely unexplored.
In this work, we take the first step to consider the online data selection for FL with limited on-device storage.
arXiv Detail & Related papers (2022-09-01T03:27:33Z)
- To Talk or to Work: Delay Efficient Federated Learning over Mobile Edge Devices [13.318419040823088]
Mobile devices collaborate to train a model based on their own data under the coordination of a central server.
Since the data are not centrally available, computing nodes need to communicate model updates frequently to attain convergence.
We propose a delay-efficient FL mechanism that reduces the overall time (consisting of both the computation and communication latencies) and communication rounds required for the model to converge.
arXiv Detail & Related papers (2021-11-01T00:35:32Z)
- Mobility-Aware Cluster Federated Learning in Hierarchical Wireless Networks [81.83990083088345]
We develop a theoretical model to characterize the hierarchical federated learning (HFL) algorithm in wireless networks.
Our analysis proves that the learning performance of HFL deteriorates drastically with highly mobile users.
To circumvent these issues, we propose a mobility-aware cluster federated learning (MACFL) algorithm.
arXiv Detail & Related papers (2021-08-20T10:46:58Z)
- Reconfigurable Intelligent Surface Enabled Federated Learning: A Unified Communication-Learning Design Approach [30.1988598440727]
We develop a unified communication-learning optimization problem to jointly optimize device selection, over-the-air transceiver design, and RIS configuration.
Numerical experiments show that the proposed design achieves substantial learning accuracy improvement compared with the state-of-the-art approaches.
arXiv Detail & Related papers (2020-11-20T08:54:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.