Data-Heterogeneous Hierarchical Federated Learning with Mobility
- URL: http://arxiv.org/abs/2306.10692v1
- Date: Mon, 19 Jun 2023 04:22:18 GMT
- Title: Data-Heterogeneous Hierarchical Federated Learning with Mobility
- Authors: Tan Chen, Jintao Yan, Yuxuan Sun, Sheng Zhou, Deniz Gunduz, Zhisheng Niu
- Abstract summary: Federated learning enables distributed training of machine learning (ML) models across multiple devices.
We consider a data-heterogeneous HFL scenario with mobility, mainly targeting vehicular networks.
We show that mobility can indeed improve the model accuracy by up to 15.1% when training a convolutional neural network.
- Score: 20.482704508355905
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning enables distributed training of machine learning (ML)
models across multiple devices in a privacy-preserving manner. Hierarchical
federated learning (HFL) is further proposed to meet the requirements of both
latency and coverage. In this paper, we consider a data-heterogeneous HFL
scenario with mobility, mainly targeting vehicular networks. We derive the
convergence upper bound of HFL with respect to mobility and data heterogeneity,
and analyze how mobility impacts the performance of HFL. While mobility is
considered as a challenge from a communication point of view, our goal here is
to exploit mobility to improve the learning performance by mitigating data
heterogeneity. Simulation results verify the analysis and show that mobility
can indeed improve the model accuracy by up to 15.1% when training a
convolutional neural network on the CIFAR-10 dataset using HFL.
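To make the setting concrete, here is a minimal simulation sketch of hierarchical federated learning with mobile clients, written from the abstract's description rather than from the authors' code: vehicles hold non-IID data and train locally, edge servers average the models of vehicles currently in their coverage, the cloud averages across edges periodically, and vehicles occasionally move to a different edge, which mixes heterogeneous data across edges. All model choices, client counts, and hyper-parameters below are illustrative assumptions (the paper itself trains a CNN on CIFAR-10).

```python
# Minimal sketch (not the authors' code): HFL with mobile, data-heterogeneous clients.
# A softmax-regression model on synthetic data stands in for the CNN/CIFAR-10 setup.
import numpy as np

rng = np.random.default_rng(0)
DIM, CLASSES = 20, 4                      # feature dimension, number of classes
N_CLIENTS, N_EDGES = 12, 3                # vehicles and edge servers
LOCAL_STEPS, CLOUD_PERIOD, ROUNDS = 5, 2, 20
LR, MOVE_PROB = 0.1, 0.3                  # learning rate, per-round mobility probability

def make_client_data(main_class, n=200):
    # Non-IID split: each vehicle mostly observes a single class.
    probs = [0.7 if c == main_class else 0.1 for c in range(CLASSES)]
    y = rng.choice(CLASSES, size=n, p=probs)
    centers = 3.0 * np.eye(CLASSES, DIM)  # class-dependent feature means
    x = centers[y] + rng.normal(size=(n, DIM))
    return x, y

def local_sgd(w, x, y):
    # Plain softmax-regression gradient steps on the vehicle's local data.
    for _ in range(LOCAL_STEPS):
        logits = x @ w
        p = np.exp(logits - logits.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        w -= LR * x.T @ (p - np.eye(CLASSES)[y]) / len(y)
    return w

clients = [make_client_data(i % CLASSES) for i in range(N_CLIENTS)]
edge_of = rng.integers(N_EDGES, size=N_CLIENTS)           # edge covering each vehicle
weights = [np.zeros((DIM, CLASSES)) for _ in range(N_CLIENTS)]

for r in range(ROUNDS):
    # 1) local training on every vehicle
    for i, (x, y) in enumerate(clients):
        weights[i] = local_sgd(weights[i], x, y)
    # 2) edge aggregation every round; cloud aggregation every CLOUD_PERIOD rounds
    for e in range(N_EDGES):
        members = [i for i in range(N_CLIENTS) if edge_of[i] == e]
        if members:
            edge_w = np.mean([weights[i] for i in members], axis=0)
            for i in members:
                weights[i] = edge_w.copy()
    if (r + 1) % CLOUD_PERIOD == 0:
        cloud_w = np.mean(weights, axis=0)
        weights = [cloud_w.copy() for _ in range(N_CLIENTS)]
    # 3) mobility: some vehicles change edge servers, shuffling edge models and
    #    gradually fusing heterogeneous data across edges
    for i in range(N_CLIENTS):
        if rng.random() < MOVE_PROB:
            edge_of[i] = rng.integers(N_EDGES)

# Quick sanity check on pooled data (illustrative only).
x_all = np.vstack([x for x, _ in clients])
y_all = np.concatenate([y for _, y in clients])
print("pooled accuracy:", np.mean((x_all @ weights[0]).argmax(axis=1) == y_all))
```

Varying MOVE_PROB lets one probe, in this toy setting, the paper's claim that mobility helps mitigate data heterogeneity across edge servers.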
Related papers
- Reconstructing Human Mobility Pattern: A Semi-Supervised Approach for Cross-Dataset Transfer Learning [10.864774173935535]
We have developed a model that reconstructs and learns human mobility patterns by focusing on semantic activity chains.
We introduce a semi-supervised iterative transfer learning algorithm to adapt models to diverse geographical contexts.
arXiv Detail & Related papers (2024-10-03T20:29:56Z)
- Mobility Accelerates Learning: Convergence Analysis on Hierarchical Federated Learning in Vehicular Networks [15.282996586821415]
We show that mobility influences the convergence speed by both fusing the edge data and shuffling the edge models.
Mobility increases the model accuracy of HFL by up to 15.1% when training a convolutional neural network.
arXiv Detail & Related papers (2024-01-18T00:09:54Z)
- Adaptive Model Pruning and Personalization for Federated Learning over Wireless Networks [72.59891661768177]
Federated learning (FL) enables distributed learning across edge devices while protecting data privacy.
We consider an FL framework with partial model pruning and personalization to overcome these challenges.
This framework splits the learning model into a global part, shared with all devices and pruned to learn data representations, and a personalized part that is fine-tuned for a specific device (see the sketch after this list).
arXiv Detail & Related papers (2023-09-04T21:10:45Z)
- Communication Resources Constrained Hierarchical Federated Learning for End-to-End Autonomous Driving [67.78611905156808]
This paper proposes an optimization-based Communication Resource Constrained Hierarchical Federated Learning (CRCHFL) framework.
Results show that the proposed CRCHFL both accelerates the convergence rate and enhances the generalization of the federated learning autonomous driving model.
arXiv Detail & Related papers (2023-06-28T12:44:59Z)
- Automated Federated Learning in Mobile Edge Networks -- Fast Adaptation and Convergence [83.58839320635956]
Federated Learning (FL) can be used in mobile edge networks to train machine learning models in a distributed manner.
Recently, FL has been interpreted within a Model-Agnostic Meta-Learning (MAML) framework, which brings FL significant advantages in fast adaptation and convergence over heterogeneous datasets.
This paper addresses how much benefit MAML brings to FL and how to maximize such benefit over mobile edge networks.
arXiv Detail & Related papers (2023-03-23T02:42:10Z)
- Time-sensitive Learning for Heterogeneous Federated Edge Intelligence [52.83633954857744]
We investigate real-time machine learning in a federated edge intelligence (FEI) system.
FEI systems exhibit heterogeneous communication and computational resource distributions.
We propose a time-sensitive federated learning (TS-FL) framework to minimize the overall run-time for collaboratively training a shared ML model.
arXiv Detail & Related papers (2023-01-26T08:13:22Z)
- Online Data Selection for Federated Learning with Limited Storage [53.46789303416799]
Federated Learning (FL) has been proposed to achieve distributed machine learning among networked devices.
The impact of on-device storage on the performance of FL has not yet been explored.
In this work, we take the first step toward online data selection for FL with limited on-device storage.
arXiv Detail & Related papers (2022-09-01T03:27:33Z)
- Towards Federated Learning on Time-Evolving Heterogeneous Data [13.080665001587281]
Federated Learning (FL) is an emerging learning paradigm that preserves privacy by ensuring client data locality on edge devices.
Despite recent research efforts on improving the optimization of heterogeneous data, the impact of time-evolving heterogeneous data in real-world scenarios has not been well studied.
We propose Continual Federated Learning (CFL), a flexible framework, to capture the time-evolving heterogeneity of FL.
arXiv Detail & Related papers (2021-12-25T14:58:52Z)
- Spatio-Temporal Federated Learning for Massive Wireless Edge Networks [23.389249751372393]
An edge server and numerous mobile devices (clients) jointly learn a global model without transporting the large amounts of data collected by the mobile devices to the edge server.
The proposed FL approach exploits spatial and temporal correlations between learning updates from different mobile devices scheduled to join STFL in various training rounds.
An analytical framework of STFL is proposed and employed to study the learning capability of STFL via its convergence performance.
arXiv Detail & Related papers (2021-10-27T16:46:45Z)
- Mobility-Aware Cluster Federated Learning in Hierarchical Wireless Networks [81.83990083088345]
We develop a theoretical model to characterize the hierarchical federated learning (HFL) algorithm in wireless networks.
Our analysis proves that the learning performance of HFL deteriorates drastically with highly mobile users.
To circumvent these issues, we propose a mobility-aware cluster federated learning (MACFL) algorithm.
arXiv Detail & Related papers (2021-08-20T10:46:58Z)
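As a side note, the pruning-and-personalization entry above describes an architecture that splits the model into a shared, prunable representation part and a locally fine-tuned personalized part. Below is a minimal sketch of that split under assumed layer sizes and with illustrative helper names; it is not the paper's code.

```python
# Hypothetical sketch of the shared/personalized split described in "Adaptive Model
# Pruning and Personalization for Federated Learning over Wireless Networks" above.
# Layer sizes, pruning ratio, and function names are assumptions for illustration.
import torch
import torch.nn as nn

class SplitModel(nn.Module):
    def __init__(self, in_dim=32, hidden=64, n_classes=10):
        super().__init__()
        # Shared representation layers: pruned and averaged across all devices.
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                    nn.Linear(hidden, hidden), nn.ReLU())
        # Personalized head: kept on the device and fine-tuned locally.
        self.personal = nn.Linear(hidden, n_classes)

    def forward(self, x):
        return self.personal(self.shared(x))

def prune_shared(model, ratio=0.3):
    # Unstructured magnitude pruning on the shared part only (illustrative).
    with torch.no_grad():
        for m in model.shared:
            if isinstance(m, nn.Linear):
                k = max(1, int(ratio * m.weight.numel()))
                thresh = m.weight.abs().flatten().kthvalue(k).values
                m.weight[m.weight.abs() < thresh] = 0.0

def aggregate_shared(models):
    # FedAvg over the shared parameters only; personalized heads stay local.
    keys = models[0].shared.state_dict().keys()
    avg = {k: torch.stack([m.shared.state_dict()[k] for m in models]).mean(0)
           for k in keys}
    for m in models:
        m.shared.load_state_dict(avg)
```

In a round, each device would train the full model locally, call prune_shared, and upload only the shared part for aggregate_shared, while the personal head never leaves the device.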