A Survey on Participant Selection for Federated Learning in Mobile
Networks
- URL: http://arxiv.org/abs/2207.03681v1
- Date: Fri, 8 Jul 2022 04:22:48 GMT
- Title: A Survey on Participant Selection for Federated Learning in Mobile
Networks
- Authors: Behnaz Soltani, Venus Haghighi, Adnan Mahmood, Quan Z. Sheng, Lina Yao
- Abstract summary: Federated Learning (FL) is an efficient distributed machine learning paradigm that employs private datasets in a privacy-preserving manner.
Due to limited communication bandwidth and unstable availability of such devices in a mobile network, only a fraction of end devices can be selected in each round.
- Score: 47.88372677863646
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) is an efficient distributed machine learning paradigm
that employs private datasets in a privacy-preserving manner. The main
challenge of FL is that end devices usually possess varying computation and
communication capabilities and their training data are not independent and
identically distributed (non-IID). Due to limited communication bandwidth and
unstable availability of such devices in a mobile network, only a fraction of
end devices (also referred to as the participants or clients in an FL process)
can be selected in each round. Hence, it is of paramount importance to utilize
an efficient participant selection scheme to maximize the performance of FL
including final model accuracy and training time. In this paper, we provide a
review of participant selection techniques for FL. First, we introduce FL and
highlight the main challenges during participant selection. Then, we review the
existing studies and categorize them based on their solutions. Finally, we
provide some future directions on participant selection for FL based on our
analysis of the state-of-the-art in this topic area.
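The per-round selection the abstract describes can be illustrated with the baseline scheme that smarter techniques replace: uniformly sampling a fraction of the available clients. This is a minimal sketch, not any specific surveyed method; the client names and fraction are illustrative.

```python
import random

def select_participants(clients, fraction=0.1, seed=None):
    """Uniformly sample a fraction of available clients for one FL round.

    This is the baseline random scheme used by FedAvg-style systems; the
    techniques reviewed in the survey replace this uniform choice with
    smarter criteria (resource awareness, data diversity, incentives, etc.).
    """
    rng = random.Random(seed)
    k = max(1, int(len(clients) * fraction))
    return rng.sample(clients, k)

# One round's participants from 100 simulated devices (names illustrative):
clients = [f"device-{i}" for i in range(100)]
round_participants = select_participants(clients, fraction=0.1, seed=42)
print(len(round_participants))  # 10
```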
Related papers
- FLIPS: Federated Learning using Intelligent Participant Selection [4.395908640091141]
FLIPS clusters parties involved in an FL training job based on the label distribution of their data apriori, and during FL training, ensures that each cluster is equitably represented in the participants selected.
We demonstrate that FLIPS significantly improves convergence, achieving higher accuracy by 17 - 20 % with 20 - 60 % lower communication costs, and these benefits endure in the presence of straggler participants.
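The cluster-equitable selection idea described above can be sketched as round-robin sampling across clusters. This is an illustrative sketch of the selection rule only, not FLIPS's actual implementation; the cluster assignment (e.g., from clustering clients by label distribution beforehand) is assumed as input.

```python
import random
from collections import defaultdict

def equitable_select(client_clusters, k, seed=None):
    """Pick k participants so that every cluster is represented as evenly
    as possible -- the core selection idea described for FLIPS.

    client_clusters: dict mapping client id -> cluster id (assumed to come
    from an a-priori clustering of clients by label distribution).
    """
    rng = random.Random(seed)
    by_cluster = defaultdict(list)
    for client, cluster in client_clusters.items():
        by_cluster[cluster].append(client)
    pools = list(by_cluster.values())
    for pool in pools:
        rng.shuffle(pool)
    selected, i = [], 0
    # Round-robin over clusters so each contributes one client per pass.
    while len(selected) < k and any(pools):
        pool = pools[i % len(pools)]
        if pool:
            selected.append(pool.pop())
        i += 1
    return selected
```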
arXiv Detail & Related papers (2023-08-07T20:28:22Z)
- Ed-Fed: A generic federated learning framework with resource-aware client selection for edge devices [0.6875312133832078]
Federated learning (FL) has evolved as a prominent method for edge devices to cooperatively create a unified prediction model.
Despite numerous research frameworks for simulating FL algorithms, they do not facilitate comprehensive deployment for automatic speech recognition tasks.
This is where Ed-Fed, a comprehensive and generic FL framework, comes in as a foundation for future practical FL system research.
arXiv Detail & Related papers (2023-07-14T07:19:20Z)
- DPP-based Client Selection for Federated Learning with Non-IID Data [97.1195165400568]
This paper proposes a client selection (CS) method to tackle the communication bottleneck of federated learning (FL).
We first analyze the effect of CS in FL and show that FL training can be accelerated by adequately choosing participants to diversify the training dataset in each round of training.
We leverage data profiling and determinantal point process (DPP) sampling techniques to develop an algorithm termed Federated Learning with DPP-based Participant Selection (FL-DP$^3$S).
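The diversity objective described above can be approximated with a greedy farthest-point rule over clients' label histograms: each step adds the client whose data distribution is least similar to those already chosen. This is a simplified stand-in in the spirit of DPP-based sampling, not the paper's actual FL-DP$^3$S algorithm; the histogram inputs are illustrative.

```python
import math

def label_histogram_distance(p, q):
    """Euclidean distance between two clients' normalized label histograms."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def diverse_select(histograms, k):
    """Greedy farthest-point selection over clients' label distributions.

    Each step adds the client farthest (in max-min distance) from the
    already-selected set, so the chosen participants jointly cover the
    label space. A true DPP would sample subsets with probability
    proportional to a kernel determinant; this greedy rule is a common,
    simpler diversity heuristic used here for illustration.
    """
    selected = [0]  # start from the first client (arbitrary anchor)
    while len(selected) < k:
        best, best_dist = None, -1.0
        for i in range(len(histograms)):
            if i in selected:
                continue
            d = min(label_histogram_distance(histograms[i], histograms[j])
                    for j in selected)
            if d > best_dist:
                best, best_dist = i, d
        selected.append(best)
    return selected
```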
arXiv Detail & Related papers (2023-03-30T13:14:54Z)
- Online Data Selection for Federated Learning with Limited Storage [53.46789303416799]
Federated Learning (FL) has been proposed to achieve distributed machine learning among networked devices.
The impact of on-device storage on the performance of FL has not yet been explored.
In this work, we take the first step to consider the online data selection for FL with limited on-device storage.
arXiv Detail & Related papers (2022-09-01T03:27:33Z)
- Cross-Silo Federated Learning: Challenges and Opportunities [30.351077030186104]
Federated learning (FL) enables the training of machine learning models from multiple clients while keeping the data distributed and private.
Based on the participating clients and the model training scale, federated learning can be classified into two types: cross-device FL, where clients are typically mobile devices and the client number can reach a scale of millions; and cross-silo FL, where clients are organizations or companies and the client number is usually small (e.g., within a hundred).
arXiv Detail & Related papers (2022-06-26T19:49:41Z)
- FLAME: Federated Learning Across Multi-device Environments [9.810211000961647]
Federated Learning (FL) enables distributed training of machine learning models while keeping personal data on user devices private.
We propose FLAME, a user-centered FL training approach to counter statistical and system heterogeneity in multi-device environments.
Our experiment results show that FLAME outperforms various baselines by 4.8-33.8% higher F-1 score, 1.02-2.86x greater energy efficiency, and up to 2.02x speedup in convergence.
arXiv Detail & Related papers (2022-02-17T22:23:56Z)
- An Incentive Mechanism for Federated Learning in Wireless Cellular network: An Auction Approach [75.08185720590748]
Federated Learning (FL) is a distributed learning framework that can deal with the distributed data issue in machine learning.
In this paper, we consider a FL system that involves one base station (BS) and multiple mobile users.
We formulate the incentive mechanism between the BS and mobile users as an auction game where the BS is an auctioneer and the mobile users are the sellers.
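The auction framing above can be sketched as a budgeted reverse auction: users (sellers) submit the payment they require to participate, and the base station (auctioneer) accepts the cheapest asks within its budget. This is a simplified illustration of the setup, not the paper's exact mechanism; the user ids, asks, and budget are hypothetical.

```python
def reverse_auction_select(bids, budget):
    """Budgeted reverse auction for participant selection.

    bids: dict mapping user id -> asking price to join a training round.
    The base station greedily accepts the cheapest asks until the budget
    is exhausted. (A simplified sketch of the auction framing; real
    mechanisms also address truthfulness and quality weighting.)
    """
    winners, spent = [], 0.0
    for user, price in sorted(bids.items(), key=lambda kv: kv[1]):
        if spent + price <= budget:
            winners.append(user)
            spent += price
    return winners, spent

# Four hypothetical mobile users bidding for a budget of 6:
winners, spent = reverse_auction_select(
    {"u1": 3.0, "u2": 1.0, "u3": 2.0, "u4": 5.0}, budget=6.0)
print(winners, spent)  # ['u2', 'u3', 'u1'] 6.0
```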
arXiv Detail & Related papers (2020-09-22T01:50:39Z)
- Wireless Communications for Collaborative Federated Learning [160.82696473996566]
Internet of Things (IoT) devices may not be able to transmit their collected data to a central controller for training machine learning models.
Google's seminal FL algorithm requires all devices to be directly connected with a central controller.
This paper introduces a novel FL framework, called collaborative FL (CFL), which enables edge devices to implement FL with less reliance on a central controller.
arXiv Detail & Related papers (2020-06-03T20:00:02Z)
- Prophet: Proactive Candidate-Selection for Federated Learning by Predicting the Qualities of Training and Reporting Phases [66.01459702625064]
In 5G networks, the training latency is still an obstacle preventing Federated Learning (FL) from being largely adopted.
One of the most fundamental problems that lead to large latency is the bad candidate-selection for FL.
In this paper, we study proactive candidate-selection for FL.
arXiv Detail & Related papers (2020-02-03T06:40:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.