Strategic Client Selection to Address Non-IIDness in HAPS-enabled FL Networks
- URL: http://arxiv.org/abs/2401.05308v1
- Date: Wed, 10 Jan 2024 18:22:00 GMT
- Title: Strategic Client Selection to Address Non-IIDness in HAPS-enabled FL Networks
- Authors: Amin Farajzadeh, Animesh Yadav, Halim Yanikomeroglu
- Abstract summary: This study introduces a client selection strategy tailored to address non-IIDness in client data distributions.
By strategically selecting clients whose data exhibit similar patterns for participation in FL training, our approach fosters a more uniform and representative data distribution.
Our simulations demonstrate that this targeted client selection methodology significantly reduces the training loss of FL models in HAPS networks.
- Score: 24.10349383347469
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The deployment of federated learning (FL) within vertical heterogeneous
networks, such as those enabled by high-altitude platform stations (HAPS),
offers the opportunity to engage a wide array of clients, each endowed with
distinct communication and computational capabilities. This diversity not only
enhances the training accuracy of FL models but also hastens their convergence.
Yet, applying FL in these expansive networks presents notable challenges,
particularly the significant non-IIDness in client data distributions. Such
data heterogeneity often results in slower convergence and degraded model
training performance. Our study introduces a client
selection strategy tailored to address this issue, leveraging user network
traffic behaviour. This strategy involves the prediction and classification of
clients based on their network usage patterns while prioritizing user privacy.
By strategically selecting clients whose data exhibit similar patterns for
participation in FL training, our approach fosters a more uniform and
representative data distribution across the network. Our simulations
demonstrate that this targeted client selection methodology significantly
reduces the training loss of FL models in HAPS networks, thereby effectively
tackling a crucial challenge in implementing large-scale FL systems.
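The abstract does not include an implementation, but the selection idea can be illustrated concretely. Below is a minimal Python sketch under stated assumptions: each client is summarized by a small vector of aggregate network-traffic statistics (the feature names are hypothetical), clients are classified into usage-pattern groups with k-means, and each training round draws an equal number of participants from every group so the cohort approximates a uniform mix of data patterns. The paper's actual prediction, classification, and privacy mechanisms may differ.

```python
# Minimal sketch of traffic-pattern-based client selection for FL.
# The traffic features and the balanced-sampling policy are assumptions
# for illustration, not the paper's exact method.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Assumed per-client features: mean daily volume, peak-hour share,
# uplink/downlink ratio (all hypothetical, normalized to [0, 1]).
num_clients = 100
traffic_features = rng.random((num_clients, 3))

# Classify clients into usage-pattern groups.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
cluster_ids = kmeans.fit_predict(traffic_features)

def select_cohort(cluster_ids: np.ndarray, cohort_size: int) -> np.ndarray:
    """Draw an equal number of clients from each traffic cluster so the
    round's cohort approximates a uniform mix of data patterns."""
    clusters = np.unique(cluster_ids)
    per_cluster = max(1, cohort_size // len(clusters))
    chosen = []
    for c in clusters:
        members = np.flatnonzero(cluster_ids == c)
        take = min(per_cluster, len(members))
        chosen.extend(rng.choice(members, size=take, replace=False))
    return np.asarray(chosen)

participants = select_cohort(cluster_ids, cohort_size=20)
print(sorted(participants.tolist()))
```

Operating on aggregate traffic statistics rather than raw client data is one way such a scheme can prioritize user privacy, in line with the abstract.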
Related papers
- Seamless Integration: Sampling Strategies in Federated Learning Systems [0.0]
Federated Learning (FL) represents a paradigm shift in the field of machine learning.
The seamless integration of new clients is imperative to sustain and enhance the performance of FL systems.
This paper outlines effective client selection strategies and solutions for ensuring system scalability and stability.
arXiv Detail & Related papers (2024-08-18T17:16:49Z)
- Efficient Model Compression for Hierarchical Federated Learning [10.37403547348343]
Federated learning (FL) has garnered significant attention due to its capacity to preserve privacy within distributed learning systems.
This paper introduces a novel hierarchical FL framework that integrates the benefits of clustered FL and model compression.
arXiv Detail & Related papers (2024-05-27T12:17:47Z)
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on the effectiveness of utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- FLASH: Federated Learning Across Simultaneous Heterogeneities [54.80435317208111]
FLASH (Federated Learning Across Simultaneous Heterogeneities) is a lightweight and flexible client selection algorithm.
It outperforms state-of-the-art FL frameworks under extensive sources of heterogeneity.
It achieves substantial and consistent improvements over state-of-the-art baselines.
arXiv Detail & Related papers (2024-02-13T20:04:39Z)
- Contrastive encoder pre-training-based clustered federated learning for heterogeneous data [17.580390632874046]
Federated learning (FL) enables distributed clients to collaboratively train a global model while preserving their data privacy.
We propose contrastive pre-training-based clustered federated learning (CP-CFL) to improve the model convergence and overall performance of FL systems.
arXiv Detail & Related papers (2023-11-28T05:44:26Z)
- FLrce: Resource-Efficient Federated Learning with Early-Stopping Strategy [7.963276533979389]
Federated Learning (FL) has gained great popularity in the Internet of Things (IoT).
We present FLrce, an efficient FL framework with a relationship-based client selection and early-stopping strategy.
Experiment results show that, compared with existing efficient FL frameworks, FLrce improves the computation and communication efficiency by at least 30% and 43% respectively.
arXiv Detail & Related papers (2023-10-15T10:13:44Z)
- Effectively Heterogeneous Federated Learning: A Pairing and Split Learning Based Approach [16.093068118849246]
This paper presents a novel split federated learning (SFL) framework that pairs clients with different computational resources.
A greedy algorithm is proposed by recasting the training-latency optimization as a graph edge selection problem (see the pairing sketch after this list).
Simulation results show the proposed method can significantly improve the FL training speed and achieve high performance.
arXiv Detail & Related papers (2023-08-26T11:10:54Z)
- PS-FedGAN: An Efficient Federated Learning Framework Based on Partially Shared Generative Adversarial Networks For Data Privacy [56.347786940414935]
Federated Learning (FL) has emerged as an effective learning paradigm for distributed computation.
This work proposes a novel FL framework that requires only partial GAN model sharing.
Named PS-FedGAN, this new framework enhances the GAN releasing and training mechanism to address heterogeneous data distributions.
arXiv Detail & Related papers (2023-05-19T05:39:40Z)
- FL Games: A Federated Learning Framework for Distribution Shifts [71.98708418753786]
Federated learning aims to train predictive models for data that is distributed across clients, under the orchestration of a server.
We propose FL GAMES, a game-theoretic framework for federated learning that learns causal features that are invariant across clients.
arXiv Detail & Related papers (2022-10-31T22:59:03Z)
- Dynamic Attention-based Communication-Efficient Federated Learning [85.18941440826309]
Federated learning (FL) offers a solution to train a global machine learning model.
FL suffers performance degradation when client data distributions are non-IID.
We propose a new adaptive training algorithm, AdaFL, to combat this degradation (see the aggregation sketch after this list).
arXiv Detail & Related papers (2021-08-12T14:18:05Z)
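None of these entries ships code, but the attention-based idea in the last entry lends itself to a generic sketch. The snippet below is a hypothetical illustration of attention-weighted server aggregation, not the actual AdaFL algorithm: each client model is weighted by a softmax over its negative distance to the current global model, which damps the influence of strongly divergent non-IID updates.

```python
# Generic, hypothetical sketch of attention-weighted server aggregation
# for non-IID FL; illustrates the flavor of attention-based schemes such
# as AdaFL, not that paper's actual algorithm.
import numpy as np

def attention_aggregate(global_model: np.ndarray,
                        client_models: list[np.ndarray],
                        temperature: float = 1.0) -> np.ndarray:
    """Weight each client model by a softmax over its negative distance
    to the current global model, then take the weighted average."""
    dists = np.array([np.linalg.norm(m - global_model) for m in client_models])
    logits = -dists / temperature
    weights = np.exp(logits - logits.max())  # numerically stable softmax
    weights /= weights.sum()
    # Strongly divergent (outlier) updates receive small weights.
    return sum(w * m for w, m in zip(weights, client_models))

rng = np.random.default_rng(1)
global_model = rng.normal(size=10)
# Two clients close to the global model, one far-off non-IID outlier.
client_models = [global_model + rng.normal(scale=s, size=10)
                 for s in (0.1, 0.1, 2.0)]
print(attention_aggregate(global_model, client_models))
```

The temperature parameter controls how aggressively outlying clients are down-weighted; as it grows large, the weights become uniform and the rule reduces to plain averaging.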
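Similarly, the pairing-and-split-learning entry above can be sketched with a simple heuristic. The following is a hypothetical illustration, not that paper's graph-edge-selection formulation: clients are sorted by an assumed compute speed and the fastest remaining client is greedily paired with the slowest, so no pair's latency is dominated by a single very slow participant.

```python
# Hypothetical sketch of greedy client pairing for split federated
# learning. The latency model (pair latency = workload / combined speed)
# and the fastest-with-slowest heuristic are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
compute_speeds = rng.uniform(1.0, 10.0, size=10)  # assumed samples/sec

def greedy_pairing(speeds: np.ndarray) -> list[tuple[int, int]]:
    """Pair the fastest remaining client with the slowest one."""
    order = np.argsort(speeds)  # slowest ... fastest
    pairs = []
    lo, hi = 0, len(order) - 1
    while lo < hi:
        pairs.append((int(order[lo]), int(order[hi])))
        lo += 1
        hi -= 1
    return pairs

def pair_latency(pair: tuple[int, int], workload: float = 100.0) -> float:
    """Assumed latency model: the pair jointly processes the workload."""
    a, b = pair
    return workload / (compute_speeds[a] + compute_speeds[b])

pairs = greedy_pairing(compute_speeds)
worst = max(pair_latency(p) for p in pairs)
print(pairs, f"worst pair latency: {worst:.2f}s")
```

An optimal edge selection could instead be computed with minimum-weight matching; the greedy sort-and-pair rule is a common lightweight stand-in.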
This list is automatically generated from the titles and abstracts of the papers in this site.