Rate Region for Indirect Multiterminal Source Coding in Federated
Learning
- URL: http://arxiv.org/abs/2101.08696v2
- Date: Tue, 26 Jan 2021 13:24:24 GMT
- Title: Rate Region for Indirect Multiterminal Source Coding in Federated
Learning
- Authors: Naifu Zhang, Meixia Tao and Jia Wang
- Abstract summary: A large number of edge devices send their local model updates to the edge server at each round of model training.
Existing works do not leverage the redundancy in the information transmitted by different edge devices.
This paper studies the rate region for the indirect multiterminal source coding problem in FL.
- Score: 49.574683687858126
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: One of the main focuses in federated learning (FL) is communication
efficiency, since a large number of participating edge devices send their
updates to the edge server at each round of the model training. Existing works
reconstruct each model update from edge devices and implicitly assume that the
local model updates are independent across edge devices. In FL, however, the model
update is an indirect multi-terminal source coding problem where each edge
device cannot observe directly the source that is to be reconstructed at the
decoder, but is instead provided only with a noisy version. The existing works
do not leverage the redundancy in the information transmitted by different
edges. This paper studies the rate region for the indirect multiterminal source
coding problem in FL. The goal is to obtain the minimum achievable rate for a
given upper bound on the gradient variance. We obtain the rate region for
multiple edge devices in the general case and derive an explicit formula for the
sum-rate distortion function in the special case where the gradients are identical
over edge devices and dimensions. Finally, we analyze the communication efficiency
of convex Minibatched SGD and non-convex Minibatched SGD based on the sum-rate
distortion function.
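To make the indirect-coding setting concrete, the following is a minimal sketch of a standard noisy-observation (CEO-type) formulation; the notation and the additive-noise model are illustrative assumptions and are not taken from the paper. Each edge device k observes a noisy version Y_k of the source X (the gradient to be reconstructed at the decoder), encodes it separately, and the edge server forms an estimate whose error must stay below a distortion bound D:
\begin{align*}
  Y_k &= X + N_k, \qquad k = 1,\dots,K,\\
  \mathcal{R}(D) &= \big\{(R_1,\dots,R_K)\;:\;\exists\, f_1,\dots,f_K,\, g \text{ s.t. } \hat{X}^n = g\big(f_1(Y_1^n),\dots,f_K(Y_K^n)\big),\ \tfrac{1}{n}\,\mathbb{E}\big\|X^n-\hat{X}^n\big\|^2 \le D\big\},\\
  R_{\mathrm{sum}}(D) &= \min_{(R_1,\dots,R_K)\in\mathcal{R}(D)} \sum_{k=1}^{K} R_k.
\end{align*}
In this notation, the paper's goal corresponds to characterizing the rate region \mathcal{R}(D) and, in the symmetric special case, giving R_sum(D) in closed form, with the distortion bound D playing the role of the upper bound on the gradient variance.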
Related papers
- Fed-ZOE: Communication-Efficient Over-the-Air Federated Learning via Zeroth-Order Estimation [15.026407830543086]
Fed-ZOE is an efficient framework inspired by the randomized gradient estimator (RGE) commonly used in zeroth-order optimization (ZOO); a minimal sketch of such an estimator is given after this list.
Fed-ZOE achieves performance comparable to Fed-OtA while drastically reducing communication costs.
arXiv Detail & Related papers (2024-12-21T21:24:58Z) - Rendering Wireless Environments Useful for Gradient Estimators: A Zero-Order Stochastic Federated Learning Method [14.986031916712108]
Cross-device federated learning (FL) is a growing machine learning framework whereby multiple edge devices collaborate to train a model without disclosing their raw data.
We show how to harness the wireless channel in the learning algorithm itself instead of analyzing it to remove its impact.
arXiv Detail & Related papers (2024-01-30T21:46:09Z) - DETR Doesn't Need Multi-Scale or Locality Design [69.56292005230185]
This paper presents an improved DETR detector that maintains a "plain" nature.
It uses a single-scale feature map and global cross-attention calculations without specific locality constraints.
We show that two simple technologies are surprisingly effective within a plain design to compensate for the lack of multi-scale feature maps and locality constraints.
arXiv Detail & Related papers (2023-08-03T17:59:04Z) - DPCN++: Differentiable Phase Correlation Network for Versatile Pose
Registration [18.60311260250232]
We present a differentiable phase correlation solver that is globally convergent and correspondence-free.
We evaluate DPCN++ on a wide range of registration tasks with different input modalities, including 2D bird's-eye view images, 3D object and scene measurements, and medical images.
arXiv Detail & Related papers (2022-06-12T10:00:34Z) - Multi-task Federated Edge Learning (MtFEEL) in Wireless Networks [1.9250873974729816]
Federated Learning (FL) has evolved as a promising technique to handle distributed machine learning across edge devices.
A novel communication efficient FL algorithm for personalised learning in a wireless setting with guarantees is presented.
arXiv Detail & Related papers (2021-08-05T10:54:38Z) - Adaptive Dynamic Pruning for Non-IID Federated Learning [3.8666113275834335]
Federated Learning(FL) has emerged as a new paradigm of training machine learning models without sacrificing data security and privacy.
We present an adaptive pruning scheme for edge devices in an FL system, which applies dataset-aware dynamic pruning for inference acceleration on Non-IID datasets.
arXiv Detail & Related papers (2021-06-13T05:27:43Z) - Bayesian Federated Learning over Wireless Networks [87.37301441859925]
Federated learning is a privacy-preserving and distributed training method using heterogeneous data sets stored at local devices.
This paper presents an efficient modified BFL algorithm called scalableBFL (SBFL)
arXiv Detail & Related papers (2020-12-31T07:32:44Z) - Over-the-Air Federated Learning from Heterogeneous Data [107.05618009955094]
Federated learning (FL) is a framework for distributed learning of centralized models.
We develop a Convergent OTA FL (COTAF) algorithm which enhances the common local stochastic gradient descent (SGD) FL algorithm.
We numerically show that the precoding induced by COTAF notably improves the convergence rate and the accuracy of models trained via OTA FL.
arXiv Detail & Related papers (2020-09-27T08:28:25Z) - A Compressive Sensing Approach for Federated Learning over Massive MIMO
Communication Systems [82.2513703281725]
Federated learning is a privacy-preserving approach to train a global model at a central server by collaborating with wireless devices.
We present a compressive sensing approach for federated learning over massive multiple-input multiple-output communication systems.
arXiv Detail & Related papers (2020-03-18T05:56:27Z) - Gradient Statistics Aware Power Control for Over-the-Air Federated
Learning [59.40860710441232]
Federated learning (FL) is a promising technique that enables many edge devices to train a machine learning model collaboratively in wireless networks.
This paper studies the power control problem for over-the-air FL by taking gradient statistics into account.
arXiv Detail & Related papers (2020-03-04T14:06:51Z)
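The Fed-ZOE and zeroth-order FL entries above rely on randomized gradient estimation. Below is a minimal, self-contained sketch of a generic forward-difference randomized gradient estimator (RGE); the function names, the probing step size mu, and the number of directions are illustrative assumptions and not the exact estimator used in those papers.

import numpy as np

def rge_gradient(loss_fn, w, num_dirs=10, mu=1e-3, rng=None):
    """Zeroth-order estimate of grad loss_fn(w) using only function
    evaluations along random Gaussian directions (generic textbook form)."""
    rng = np.random.default_rng() if rng is None else rng
    d = w.shape[0]
    g = np.zeros(d)
    f0 = loss_fn(w)                                 # baseline loss value
    for _ in range(num_dirs):
        u = rng.standard_normal(d)                  # random probing direction
        g += (loss_fn(w + mu * u) - f0) / mu * u    # forward-difference estimate
    return g / num_dirs

# Toy usage: estimate the gradient of a quadratic and compare with the truth.
if __name__ == "__main__":
    A = np.diag([1.0, 2.0, 3.0])
    loss = lambda w: 0.5 * w @ A @ w
    w = np.array([1.0, -1.0, 0.5])
    print("RGE estimate :", rge_gradient(loss, w, num_dirs=200))
    print("true gradient:", A @ w)

Because only loss evaluations (not backpropagated gradients) need to be exchanged or aggregated, estimators of this kind are attractive when communication or hardware constraints make exact gradient computation and transmission costly.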