Federated Learning With Quantized Global Model Updates
- URL: http://arxiv.org/abs/2006.10672v2
- Date: Wed, 7 Oct 2020 01:15:06 GMT
- Title: Federated Learning With Quantized Global Model Updates
- Authors: Mohammad Mohammadi Amiri, Deniz Gunduz, Sanjeev R. Kulkarni, H. Vincent Poor
- Abstract summary: We study federated learning, which enables mobile devices to utilize their local datasets to train a global model.
We introduce a lossy FL (LFL) algorithm, in which both the global model and the local model updates are quantized before being transmitted.
- Score: 84.55126371346452
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study federated learning (FL), which enables mobile devices to utilize
their local datasets to collaboratively train a global model with the help of a
central server, while keeping data localized. At each iteration, the server
broadcasts the current global model to the devices for local training, and
aggregates the local model updates from the devices to update the global model.
Previous work on the communication efficiency of FL has mainly focused on the
aggregation of model updates from the devices, assuming perfect broadcasting of
the global model. In this paper, we instead consider broadcasting a compressed
version of the global model. This is to further reduce the communication cost
of FL, which can be particularly limited when the global model is to be
transmitted over a wireless medium. We introduce a lossy FL (LFL) algorithm, in
which both the global model and the local model updates are quantized before
being transmitted. We analyze the convergence behavior of the proposed LFL
algorithm assuming the availability of accurate local model updates at the
server. Numerical experiments show that the proposed LFL scheme, which
quantizes the global model update (with respect to the global model estimate at
the devices) rather than the global model itself, significantly outperforms
existing schemes that quantize the global model in the parameter server
(PS)-to-device direction. Moreover, the performance loss of the proposed scheme
is marginal compared to the fully lossless approach, in which the PS and the
devices transmit their messages without any quantization.
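To make the idea concrete, here is a minimal sketch of one LFL-style broadcast round, assuming an unbiased stochastic uniform quantizer (a common choice; the paper's exact quantizer and bit allocation may differ). The key point is that the server encodes the difference between the new global model and the devices' current estimate of it, rather than the model itself.

```python
import numpy as np

def stochastic_quantize(x, num_bits=4, rng=np.random.default_rng(0)):
    # Unbiased stochastic uniform quantizer over [x.min(), x.max()]
    # (an illustrative choice; the paper's quantizer may differ).
    levels = 2 ** num_bits - 1
    lo, hi = x.min(), x.max()
    scale = (hi - lo) / levels if hi > lo else 1.0
    normalized = (x - lo) / scale                  # in [0, levels]
    floor = np.floor(normalized)
    round_up = rng.random(x.shape) < (normalized - floor)
    return lo + (floor + round_up) * scale

dim = 1000
global_model = np.random.randn(dim)  # global model after aggregation at the PS
device_estimate = np.zeros(dim)      # devices' running estimate of the global model

# Broadcast the quantized *update* relative to the devices' estimate,
# rather than a quantized copy of the global model itself.
update = global_model - device_estimate
device_estimate += stochastic_quantize(update, num_bits=4)
print("residual broadcast error:", np.linalg.norm(device_estimate - global_model))
```

Because the quantizer is unbiased, the devices' estimate tracks the global model in expectation, and the residual error shrinks as more bits are spent per round.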
Related papers
- Federated Learning with MMD-based Early Stopping for Adaptive GNSS Interference Classification [4.674584508653125]
Federated learning (FL) enables multiple devices to collaboratively train a global model while maintaining data on local servers.
We propose an FL approach using few-shot learning and aggregation of the model weights on a global server.
An exemplary application of FL is orchestrating machine learning models along highways for interference classification based on snapshots from global navigation satellite system (GNSS) receivers.
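For intuition, a sketch of the MMD statistic that could drive such early stopping; the RBF kernel, feature inputs, and threshold below are assumptions, not details from the paper.

```python
import numpy as np

def mmd2_rbf(x, y, sigma=1.0):
    # Biased estimator of squared maximum mean discrepancy with an RBF kernel.
    def gram(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    return gram(x, x).mean() + gram(y, y).mean() - 2.0 * gram(x, y).mean()

rng = np.random.default_rng(0)
local_feats = rng.standard_normal((64, 8))   # features under the local model (stand-in)
global_feats = rng.standard_normal((64, 8))  # features under the global model (stand-in)
stop_early = mmd2_rbf(local_feats, global_feats) < 1e-3  # assumed stopping threshold
```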
arXiv Detail & Related papers (2024-10-21T06:43:04Z)
- Stragglers-Aware Low-Latency Synchronous Federated Learning via Layer-Wise Model Updates [71.81037644563217]
Synchronous federated learning (FL) is a popular paradigm for collaborative edge learning.
As some of the devices may have limited computational resources and varying availability, FL latency is highly sensitive to stragglers.
We propose straggler-aware layer-wise federated learning (SALF) that leverages the optimization procedure of NNs via backpropagation to update the global model in a layer-wise fashion.
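A minimal sketch of layer-wise aggregation in this spirit: since backpropagation produces gradients last-layer-first, a straggler cut off mid-pass still holds valid updates for its deepest layers, and the server can average each layer over whichever devices reached it. The data layout below is illustrative, not SALF's actual implementation.

```python
import numpy as np

num_layers = 4
# device_updates[d] maps layer index -> update tensor; backprop runs from the
# last layer backwards, so a straggler only covers the deepest layers it reached.
device_updates = [
    {3: np.ones(10), 2: np.ones(10)},                                  # straggler
    {3: np.ones(10), 2: np.ones(10), 1: np.ones(10), 0: np.ones(10)},  # finished
]

global_update = {}
for layer in range(num_layers):
    contribs = [u[layer] for u in device_updates if layer in u]
    if contribs:                      # layers nobody reached are left unchanged
        global_update[layer] = np.mean(contribs, axis=0)
```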
arXiv Detail & Related papers (2024-03-27T09:14:36Z)
- Tunable Soft Prompts are Messengers in Federated Learning [55.924749085481544]
Federated learning (FL) enables multiple participants to collaboratively train machine learning models using decentralized data sources.
The lack of model privacy protection in FL has become a challenge that cannot be neglected.
We propose a novel FL training approach that accomplishes information exchange among participants via tunable soft prompts.
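A rough sketch of the communication pattern this suggests, with the aggregation rule (a plain average of prompt tensors) as an assumption: the backbone model stays frozen on every participant, so only a small prompt tensor moves over the network.

```python
import numpy as np

prompt_len, embed_dim, num_clients = 16, 32, 4
# Each client tunes only a small soft prompt against its frozen backbone.
client_prompts = [np.random.randn(prompt_len, embed_dim) * 0.02
                  for _ in range(num_clients)]
# The server aggregates prompts instead of model weights (plain average
# here as an assumption) and sends the result back for the next round.
server_prompt = np.mean(client_prompts, axis=0)
```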
arXiv Detail & Related papers (2023-11-12T11:01:10Z)
- FedSoup: Improving Generalization and Personalization in Federated Learning via Selective Model Interpolation [32.36334319329364]
Cross-silo federated learning (FL) enables the development of machine learning models on datasets distributed across data centers.
Recent research has found that current FL algorithms face a trade-off between local and global performance when confronted with distribution shifts.
We propose a novel federated model soup method to optimize the trade-off between local and global performance.
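A minimal sketch of a greedy "model soup" interpolation of this kind; the selection rule below (fold a candidate in only if held-out performance does not drop) is the standard greedy-soup recipe and an assumption about FedSoup's exact procedure.

```python
import numpy as np

def held_out_score(weights):
    # Stand-in for validation accuracy of a model with these weights;
    # this toy objective simply rewards weights close to 1.
    return -np.linalg.norm(weights - 1.0)

candidates = [np.random.randn(10) for _ in range(5)]  # local/global model weights
soup, n = candidates[0].copy(), 1
for w in candidates[1:]:
    trial = (soup * n + w) / (n + 1)            # soup with the candidate folded in
    if held_out_score(trial) >= held_out_score(soup):
        soup, n = trial, n + 1                  # keep the candidate only if it helps
```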
arXiv Detail & Related papers (2023-07-20T00:07:29Z)
- Fed-FSNet: Mitigating Non-I.I.D. Federated Learning via Fuzzy Synthesizing Network [19.23943687834319]
Federated learning (FL) has emerged as a promising privacy-preserving distributed machine learning framework.
We propose a novel FL training framework, dubbed Fed-FSNet, using a properly designed Fuzzy Synthesizing Network (FSNet) to mitigate the Non-I.I.D. at-the-source issue.
arXiv Detail & Related papers (2022-08-21T18:40:51Z)
- Parallel Successive Learning for Dynamic Distributed Model Training over Heterogeneous Wireless Networks [50.68446003616802]
Federated learning (FedL) has emerged as a popular technique for distributing model training over a set of wireless devices.
We develop parallel successive learning (PSL), which expands the FedL architecture along three dimensions.
Our analysis sheds light on the notion of cold vs. warmed-up models and on model inertia in distributed machine learning.
arXiv Detail & Related papers (2022-02-07T05:11:01Z)
- Gradual Federated Learning with Simulated Annealing [26.956032164461377]
Federated averaging (FedAvg) is a popular federated learning (FL) technique that updates the global model by averaging local models.
In this paper, we propose a new FL technique based on simulated annealing, referred to as simulated annealing-based FL (SAFL).
We show that SAFL outperforms the conventional FedAvg technique in terms of convergence speed and classification accuracy.
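A toy sketch of a simulated-annealing flavored client update; the temperature schedule and acceptance rule below are assumptions, not the paper's exact design. Early (hot) rounds favor keeping the locally trained model, while later (cold) rounds adopt the FedAvg average.

```python
import numpy as np

rng = np.random.default_rng(0)
T0, decay = 5.0, 0.9
local = rng.standard_normal(10)                 # client's locally trained model
for rnd in range(20):
    global_model = rng.standard_normal(10)      # stand-in for the FedAvg average
    T = T0 * decay ** rnd                       # temperature cools each round
    p_keep_local = np.exp(-1.0 / max(T, 1e-8))  # decays toward 0 as T cools
    if rng.random() >= p_keep_local:
        local = global_model.copy()             # cold rounds: adopt the global model
    # hot rounds: keep refining the local model instead
```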
arXiv Detail & Related papers (2021-10-11T11:57:56Z)
- Federated Learning with Downlink Device Selection [92.14944020945846]
We study federated edge learning, where a global model is trained collaboratively using privacy-sensitive data at the edge of a wireless network.
A parameter server (PS) keeps track of the global model and shares it with the wireless edge devices for training using their private local data.
We consider device selection based on downlink channels over which the PS shares the global model with the devices.
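A minimal sketch of one such policy, assuming the simple rule "select the K devices with the strongest downlink channels" (the paper studies selection policies in more depth).

```python
import numpy as np

rng = np.random.default_rng(1)
num_devices, K = 20, 5
downlink_gain = np.abs(rng.standard_normal(num_devices))  # e.g., fading magnitudes
selected = np.argsort(downlink_gain)[-K:]                 # K strongest downlink channels
# Only the selected devices receive the global model and train this round.
print("participating devices:", sorted(selected.tolist()))
```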
arXiv Detail & Related papers (2021-07-07T22:42:39Z)
- Think Locally, Act Globally: Federated Learning with Local and Global Representations [92.68484710504666]
Federated learning is a method of training models on private data distributed over multiple devices.
We propose a new federated learning algorithm that jointly learns compact local representations on each device and a global model across all devices.
We also evaluate on the task of personalized mood prediction from real-world mobile data where privacy is key.
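A minimal sketch of the parameter split this implies, with illustrative layer names: each device keeps its representation layers private, and only the shared global part is averaged and broadcast.

```python
import numpy as np

devices = [
    {"local.encoder": np.random.randn(8),   # private representation, never shared
     "global.head": np.random.randn(4)}     # shared part, aggregated by the server
    for _ in range(3)
]
global_head = np.mean([d["global.head"] for d in devices], axis=0)
for d in devices:
    d["global.head"] = global_head.copy()   # broadcast only the shared part
```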
arXiv Detail & Related papers (2020-01-06T12:40:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.