Navigating High-Degree Heterogeneity: Federated Learning in Aerial and Space Networks
- URL: http://arxiv.org/abs/2406.17951v2
- Date: Tue, 17 Sep 2024 19:14:33 GMT
- Title: Navigating High-Degree Heterogeneity: Federated Learning in Aerial and Space Networks
- Authors: Fan Dong, Henry Leung, Steve Drew
- Abstract summary: Federated learning offers a compelling solution to the challenges of networking and data privacy within aerial and space networks.
In this paper, we explore the influence of heterogeneity on class imbalance, which diminishes performance in ASNs-based federated learning.
- Score: 8.766411351797885
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning offers a compelling solution to the challenges of networking and data privacy within aerial and space networks by utilizing vast private edge data and computing capabilities accessible through drones, balloons, and satellites. While current research has focused on optimizing the learning process, computing efficiency, and minimizing communication overhead, heterogeneity and class imbalance remain significant barriers to rapid model convergence. In this paper, we explore the influence of heterogeneity on class imbalance, which diminishes performance in Aerial and Space Networks (ASNs)-based federated learning. We illustrate the correlation between heterogeneity and class imbalance within grouped data and show how constraints such as battery life exacerbate the class imbalance challenge. Our findings indicate that ASNs-based FL faces heightened class imbalance issues even at levels of heterogeneity similar to other scenarios. Finally, we analyze the impact of varying degrees of heterogeneity on FL training and evaluate the efficacy of current state-of-the-art algorithms under these conditions. Our results reveal that the heterogeneity challenge is more pronounced in ASNs-based federated learning and that prevailing algorithms often fail to effectively address high levels of heterogeneity.
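The correlation between heterogeneity and per-client class imbalance described in the abstract can be illustrated with a common FL simulation technique (a sketch, not the paper's own experimental setup): Dirichlet label partitioning, where a smaller concentration parameter alpha produces more heterogeneous, and hence more class-imbalanced, client shards. The function names and parameters below are hypothetical, chosen for illustration only.

```python
import numpy as np

def dirichlet_partition(labels, num_clients, alpha, seed=0):
    """Split sample indices across clients with Dirichlet(alpha) label skew.

    Smaller alpha -> more heterogeneous (non-IID) client datasets.
    """
    rng = np.random.default_rng(seed)
    client_indices = [[] for _ in range(num_clients)]
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        rng.shuffle(idx)
        # Fraction of class c assigned to each client.
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cut_points = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, split in zip(client_indices, np.split(idx, cut_points)):
            client.extend(split.tolist())
    return client_indices

def imbalance_ratio(labels, indices, num_classes):
    """Max/min nonzero class count within one client's shard (higher = more imbalanced)."""
    counts = np.bincount(labels[indices], minlength=num_classes)
    nonzero = counts[counts > 0]
    return nonzero.max() / nonzero.min()

# Synthetic balanced label pool: 10 classes, 1000 samples each.
labels = np.repeat(np.arange(10), 1000)
for alpha in (100.0, 0.1):  # near-IID vs. highly heterogeneous
    shards = dirichlet_partition(labels, num_clients=20, alpha=alpha)
    ratios = [imbalance_ratio(labels, np.array(s), 10) for s in shards if s]
    print(f"alpha={alpha}: mean per-client imbalance ratio = {np.mean(ratios):.1f}")
```

Even though the global label pool is perfectly balanced, small alpha concentrates each class on a few clients, so individual shards become strongly class-imbalanced, which is the coupling the paper argues is amplified in ASN settings.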
Related papers
- FedSat: A Statistical Aggregation Approach for Class Imbalanced Clients in Federated Learning [2.5628953713168685]
Federated learning (FL) has emerged as a promising paradigm for privacy-preserving distributed machine learning.
This paper introduces FedSat, a novel FL approach designed to tackle various forms of data heterogeneity simultaneously.
arXiv Detail & Related papers (2024-07-04T11:50:24Z)
- FedShift: Tackling Dual Heterogeneity Problem of Federated Learning via Weight Shift Aggregation [6.3842184099869295]
Federated Learning (FL) offers a compelling method for training machine learning models with a focus on preserving data privacy.
The presence of system heterogeneity and statistical heterogeneity, recognized challenges in FL, arises from the diversity of client hardware, network, and dataset distribution.
This paper introduces FedShift, a novel algorithm designed to enhance both the training speed and the models' accuracy in a dual heterogeneous scenario.
arXiv Detail & Related papers (2024-02-02T00:03:51Z)
- Hierarchical Over-the-Air Federated Learning with Awareness of Interference and Data Heterogeneity [3.8798345704175534]
We introduce a scalable transmission scheme that efficiently uses a single wireless resource through over-the-air computation.
We show that despite the interference and the data heterogeneity, the proposed scheme achieves high learning accuracy and can significantly outperform the conventional hierarchical algorithm.
arXiv Detail & Related papers (2024-01-02T21:43:01Z)
- FedFN: Feature Normalization for Alleviating Data Heterogeneity Problem in Federated Learning [29.626725039794383]
We introduce Federated Averaging with Feature Normalization Update (FedFN), a straightforward learning method.
We demonstrate the superior performance of FedFN through extensive experiments, even when applied to pretrained ResNet18.
arXiv Detail & Related papers (2023-11-22T09:37:33Z)
- Generalizable Heterogeneous Federated Cross-Correlation and Instance Similarity Learning [60.058083574671834]
This paper presents FCCL+, a novel federated correlation and similarity learning method with non-target distillation.
For the heterogeneity issue, it leverages irrelevant unlabeled public data for communication.
For catastrophic forgetting in the local updating stage, FCCL+ introduces Federated Non-Target Distillation.
arXiv Detail & Related papers (2023-09-28T09:32:27Z)
- UNIDEAL: Curriculum Knowledge Distillation Federated Learning [17.817181326740698]
Federated Learning (FL) has emerged as a promising approach to enable collaborative learning among multiple clients.
In this paper, we present UNIDEAL, a novel FL algorithm specifically designed to tackle the challenges of cross-domain scenarios.
Our results demonstrate that UNIDEAL achieves superior performance in terms of both model accuracy and communication efficiency.
arXiv Detail & Related papers (2023-09-16T11:30:29Z)
- Analysis and Optimization of Wireless Federated Learning with Data Heterogeneity [72.85248553787538]
This paper focuses on performance analysis and optimization for wireless FL, considering data heterogeneity, combined with wireless resource allocation.
We formulate the loss function minimization problem, under constraints on long-term energy consumption and latency, and jointly optimize client scheduling, resource allocation, and the number of local training epochs (CRE).
Experiments on real-world datasets demonstrate that the proposed algorithm outperforms other benchmarks in terms of the learning accuracy and energy consumption.
arXiv Detail & Related papers (2023-08-04T04:18:01Z)
- Federated Compositional Deep AUC Maximization [58.25078060952361]
We develop a novel federated learning method for imbalanced data by directly optimizing the area under curve (AUC) score.
To the best of our knowledge, this is the first work to achieve such favorable theoretical results.
arXiv Detail & Related papers (2023-04-20T05:49:41Z)
- Heterogeneous Federated Learning via Grouped Sequential-to-Parallel Training [60.892342868936865]
Federated learning (FL) is a rapidly growing privacy-preserving collaborative machine learning paradigm.
We propose a data heterogeneous-robust FL approach, FedGSP, to address this challenge.
We show that FedGSP improves the accuracy by 3.7% on average compared with seven state-of-the-art approaches.
arXiv Detail & Related papers (2022-01-31T03:15:28Z)
- Quasi-Global Momentum: Accelerating Decentralized Deep Learning on Heterogeneous Data [77.88594632644347]
Decentralized training of deep learning models is a key element for enabling data privacy and on-device learning over networks.
In realistic learning scenarios, the presence of heterogeneity across different clients' local datasets poses an optimization challenge.
We propose a novel momentum-based method to mitigate this decentralized training difficulty.
arXiv Detail & Related papers (2021-02-09T11:27:14Z)
- Heteroskedastic and Imbalanced Deep Learning with Adaptive Regularization [55.278153228758434]
Real-world datasets are heteroskedastic and imbalanced.
Addressing heteroskedasticity and imbalance simultaneously is under-explored.
We propose a data-dependent regularization technique for heteroskedastic datasets.
arXiv Detail & Related papers (2020-06-29T01:09:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.