Hierarchical Federated Learning in Multi-hop Cluster-Based VANETs
- URL: http://arxiv.org/abs/2401.10361v1
- Date: Thu, 18 Jan 2024 20:05:34 GMT
- Title: Hierarchical Federated Learning in Multi-hop Cluster-Based VANETs
- Authors: M. Saeid HaghighiFard and Sinem Coleri
- Abstract summary: This paper introduces a novel framework for hierarchical federated learning (HFL) over multi-hop clustering-based VANET.
The proposed method utilizes a weighted combination of the average relative speed and cosine similarity of FL model parameters as a clustering metric.
Through extensive simulations, the proposed hierarchical federated learning over clustered VANET has been demonstrated to improve accuracy and convergence time significantly.
- Score: 12.023861154677205
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The usage of federated learning (FL) in Vehicular Ad hoc Networks (VANET) has
garnered significant interest in research due to the advantages of reducing
transmission overhead and protecting user privacy by communicating local
dataset gradients instead of raw data. However, implementing FL in VANETs faces
challenges, including limited communication resources, high vehicle mobility,
and the statistical diversity of data distributions. In order to tackle these
issues, this paper introduces a novel framework for hierarchical federated
learning (HFL) over multi-hop clustering-based VANET. The proposed method
utilizes a weighted combination of the average relative speed and cosine
similarity of FL model parameters as a clustering metric to consider both data
diversity and high vehicle mobility. This metric ensures convergence with
minimum changes in cluster heads while tackling the complexities associated
with non-independent and identically distributed (non-IID) data scenarios.
Additionally, the framework includes a novel mechanism to manage seamless
transitions of cluster heads (CHs), followed by transferring the most recent FL
model parameter to the designated CH. Furthermore, the proposed approach
considers the option of merging CHs, aiming to reduce their count and,
consequently, mitigate associated overhead. Through extensive simulations, the
proposed hierarchical federated learning over clustered VANET has been
demonstrated to improve accuracy and convergence time significantly while
maintaining an acceptable level of packet overhead compared to previously
proposed clustering algorithms and non-clustered VANET.
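The clustering metric described in the abstract can be illustrated with a short sketch. This is not the authors' implementation; the weight `alpha`, the speed normalization `v_max`, and the function names are assumptions introduced here for illustration. The idea is that a candidate neighbor scores higher when its average relative speed is low (stable link) and its local FL model parameters point in a similar direction (similar data distribution).

```python
import numpy as np

def cosine_similarity(w_a, w_b):
    """Cosine similarity between two flattened model-parameter vectors."""
    return float(np.dot(w_a, w_b) / (np.linalg.norm(w_a) * np.linalg.norm(w_b)))

def clustering_metric(avg_rel_speed, w_vehicle, w_neighbor, alpha=0.5, v_max=30.0):
    """Weighted combination of a mobility term and a model-similarity term.

    avg_rel_speed: average relative speed to a candidate neighbor (m/s),
    normalized by an assumed maximum v_max so both terms lie in a
    comparable range. Lower relative speed and higher cosine similarity
    both raise the score; alpha trades the two objectives off.
    """
    mobility_term = 1.0 - min(avg_rel_speed / v_max, 1.0)
    similarity_term = cosine_similarity(w_vehicle, w_neighbor)
    return alpha * mobility_term + (1.0 - alpha) * similarity_term
```

Under this sketch, a stationary pair of vehicles with identical model parameters attains the maximum score, while fast-moving vehicles with dissimilar models score low, which is the behavior the paper's metric is designed to capture.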
Related papers
- Boosting the Performance of Decentralized Federated Learning via Catalyst Acceleration [66.43954501171292]
We introduce Catalyst Acceleration and propose an accelerated decentralized federated learning algorithm called DFedCata.
DFedCata consists of two main components: the Moreau envelope function, which addresses parameter inconsistencies, and Nesterov's extrapolation step, which accelerates the aggregation phase.
Empirically, we demonstrate the advantages of the proposed algorithm in both convergence speed and generalization performance on CIFAR-10/100 with various non-IID data distributions.
arXiv Detail & Related papers (2024-10-09T06:17:16Z) - Dual-Segment Clustering Strategy for Hierarchical Federated Learning in Heterogeneous Wireless Environments [22.35256018841889]
Non-independent and identically distributed (non-IID) data adversely affects federated learning (FL).
This paper proposes a novel dual-segment clustering (DSC) strategy that jointly addresses communication and data heterogeneity in FL.
The convergence analysis and experimental results show that the DSC strategy can improve the convergence rate of wireless FL.
arXiv Detail & Related papers (2024-05-15T11:46:47Z) - Rethinking Clustered Federated Learning in NOMA Enhanced Wireless Networks [60.09912912343705]
This study explores the benefits of integrating the novel clustered federated learning (CFL) approach with non-independent and identically distributed (non-IID) datasets.
A detailed theoretical analysis of the generalization gap that measures the degree of non-IID in the data distribution is presented.
Solutions to address the challenges posed by non-IID conditions are proposed with the analysis of the properties.
arXiv Detail & Related papers (2024-03-05T17:49:09Z) - Clustered Data Sharing for Non-IID Federated Learning over Wireless Networks [39.80420645943706]
Federated Learning (FL) is a distributed machine learning approach that leverages data from the Internet of Things (IoT).
Current FL algorithms face the challenges of non-independent and identically distributed (non-IID) data, which causes high communication costs and model accuracy declines.
We propose a clustered data sharing framework which spares the partial data from cluster heads to credible associates through device-to-device (D2D) communication.
arXiv Detail & Related papers (2023-02-17T07:11:02Z) - FedRC: Tackling Diverse Distribution Shifts Challenge in Federated Learning by Robust Clustering [4.489171618387544]
Federated Learning (FL) is a machine learning paradigm that safeguards privacy by retaining client data on edge devices.
In this paper, we identify the learning challenges posed by the simultaneous occurrence of diverse distribution shifts.
We propose a novel clustering algorithm framework, dubbed as FedRC, which adheres to our proposed clustering principle.
arXiv Detail & Related papers (2023-01-29T06:50:45Z) - Clustered Federated Learning based on Nonconvex Pairwise Fusion [22.82565500426576]
We introduce a novel clustered FL setting called fusion clustering (FPFC).
FPFC can perform partial updates at each communication round and allows parallel computation with variable workloads.
We also propose a new practical strategy for FPFC with general losses and robustness.
arXiv Detail & Related papers (2022-11-08T13:04:56Z) - Multi-view Multi-label Anomaly Network Traffic Classification based on MLP-Mixer Neural Network [55.21501819988941]
Existing network traffic classification based on convolutional neural networks (CNNs) often emphasizes local patterns of traffic data while ignoring global information associations.
We propose an end-to-end network traffic classification method.
arXiv Detail & Related papers (2022-10-30T01:52:05Z) - Heterogeneous Federated Learning via Grouped Sequential-to-Parallel Training [60.892342868936865]
Federated learning (FL) is a rapidly growing privacy-preserving collaborative machine learning paradigm.
We propose a data heterogeneous-robust FL approach, FedGSP, to address this challenge.
We show that FedGSP improves the accuracy by 3.7% on average compared with seven state-of-the-art approaches.
arXiv Detail & Related papers (2022-01-31T03:15:28Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z) - Communication-Efficient Hierarchical Federated Learning for IoT Heterogeneous Systems with Imbalanced Data [42.26599494940002]
Federated learning (FL) is a distributed learning methodology that allows multiple nodes to cooperatively train a deep learning model.
This paper studies the potential of hierarchical FL in IoT heterogeneous systems.
It proposes an optimized solution for user assignment and resource allocation on multiple edge nodes.
arXiv Detail & Related papers (2021-07-14T08:32:39Z) - Clustered Federated Learning via Generalized Total Variation Minimization [83.26141667853057]
We study optimization methods to train local (or personalized) models for local datasets with a decentralized network structure.
Our main conceptual contribution is to formulate federated learning as generalized total variation (GTV) minimization.
Our main algorithmic contribution is a fully decentralized federated learning algorithm.
arXiv Detail & Related papers (2021-05-26T18:07:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.