Communication-Efficient Federated Learning via Predictive Coding
- URL: http://arxiv.org/abs/2108.00918v1
- Date: Mon, 2 Aug 2021 14:12:19 GMT
- Title: Communication-Efficient Federated Learning via Predictive Coding
- Authors: Kai Yue, Richeng Jin, Chau-Wai Wong, Huaiyu Dai
- Abstract summary: Federated learning can enable remote workers to collaboratively train a shared machine learning model.
The communication overhead is a critical bottleneck due to limited power and bandwidth.
We propose a predictive coding based communication scheme for federated learning.
- Score: 38.778944321534084
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning can enable remote workers to collaboratively train a
shared machine learning model while allowing training data to be kept locally.
In the use case of wireless mobile devices, the communication overhead is a
critical bottleneck due to limited power and bandwidth. Prior work has utilized
various data compression tools such as quantization and sparsification to
reduce the overhead. In this paper, we propose a predictive coding based
communication scheme for federated learning. The scheme has shared prediction
functions among all devices and allows each worker to transmit a compressed
residual vector derived from the reference. In each communication round, we
select the predictor and quantizer based on the rate-distortion cost, and
further reduce the redundancy with entropy coding. Extensive simulations reveal
that the communication cost can be reduced by up to 99% while achieving even better
learning performance than the baseline methods.
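The scheme described in the abstract lends itself to a compact illustration. Below is a minimal sketch of the residual-transmission idea, assuming a uniform quantizer, two simple shared predictors (repeat-last and linear extrapolation), and a crude bit-count rate proxy; the paper's actual predictor set, quantizer design, and entropy coder are not reproduced here.

```python
import numpy as np

def uniform_quantize(x, num_bits):
    """Uniformly quantize x to 2**num_bits levels over its own dynamic range."""
    levels = 2 ** num_bits
    lo, hi = float(x.min()), float(x.max())
    if hi == lo:
        return np.full_like(x, lo), 0.0
    step = (hi - lo) / (levels - 1)
    q = np.round((x - lo) / step) * step + lo
    return q, float(np.mean((x - q) ** 2))  # quantized vector, distortion (MSE)

def encode_round(new_params, history, candidate_bits=(2, 4, 8), lam=1e-3):
    """Choose predictor and quantizer by a rate-distortion cost D + lam * R,
    then return the quantized residual to be entropy coded and transmitted."""
    # Shared prediction functions, identical on every device and the server
    # (illustrative choices, not the paper's exact predictor set).
    predictors = [
        lambda h: h[-1],                                       # repeat last round
        lambda h: 2 * h[-1] - h[-2] if len(h) > 1 else h[-1],  # linear extrapolation
    ]
    best = None
    for p_idx, predict in enumerate(predictors):
        residual = new_params - predict(history)               # residual w.r.t. reference
        for bits in candidate_bits:
            q_res, distortion = uniform_quantize(residual, bits)
            rate = bits * residual.size                        # rate proxy, pre entropy coding
            cost = distortion + lam * rate
            if best is None or cost < best[0]:
                best = (cost, p_idx, bits, q_res)
    _, p_idx, bits, q_res = best
    return p_idx, bits, q_res  # indices are sent so the server can reconstruct
```

On the receiving side, the server would entropy decode and dequantize each residual, then add it back to the output of the same shared predictor to reconstruct the worker's update.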
Related papers
- Communication Efficient ConFederated Learning: An Event-Triggered SAGA
Approach [67.27031215756121]
Federated learning (FL) is a machine learning paradigm that targets model training without gathering the local data over various data sources.
Standard FL, which employs a single server, can only support a limited number of users, leading to degraded learning capability.
In this work, we consider a multi-server FL framework, referred to as Confederated Learning (CFL), in order to accommodate a larger number of users.
arXiv Detail & Related papers (2024-02-28T03:27:10Z) - An Efficient Federated Learning Framework for Training Semantic
Communication System [29.593406320684448]
Most semantic communication systems are built upon advanced deep learning models.
Due to privacy and security concerns, the transmission of data is restricted.
We introduce a mechanism to aggregate the global model from clients, called FedLol.
arXiv Detail & Related papers (2023-10-20T02:45:20Z) - Communication Efficient Distributed Learning over Wireless Channels [35.90632878033643]
Vertical distributed learning exploits the local features collected by multiple learning workers to form a better global model.
We propose a novel hierarchical distributed learning framework, where each worker separately learns a low-dimensional embedding of their local observed data.
We show that the proposed framework achieves almost the same model accuracy as a model trained on the concatenation of all the raw outputs from the workers; a small sketch of the setup follows.
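A minimal sketch of the hierarchical setup described in this summary, assuming PyTorch and arbitrary layer sizes: each worker maps its local feature slice to a low-dimensional embedding, and a server-side head consumes the concatenated embeddings in place of the raw features.

```python
import torch
import torch.nn as nn

class WorkerEncoder(nn.Module):
    """Learns a low-dimensional embedding of one worker's local features."""
    def __init__(self, local_dim, embed_dim=8):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(local_dim, 32), nn.ReLU(),
                                 nn.Linear(32, embed_dim))

    def forward(self, x_local):
        return self.net(x_local)      # transmitted instead of the raw features

class ServerHead(nn.Module):
    """Forms the global prediction from the concatenated worker embeddings."""
    def __init__(self, num_workers, embed_dim=8, num_classes=10):
        super().__init__()
        self.head = nn.Linear(num_workers * embed_dim, num_classes)

    def forward(self, embeddings):    # list of per-worker embedding tensors
        return self.head(torch.cat(embeddings, dim=-1))
```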
arXiv Detail & Related papers (2022-09-04T19:41:21Z) - A Machine Learning Framework for Distributed Functional Compression over
Wireless Channels in IoT [13.385373310554327]
IoT devices generating enormous amounts of data, combined with state-of-the-art machine learning techniques, will revolutionize cyber-physical systems.
Traditional cloud-based methods that focus on transferring data to a central location either for training or inference place enormous strain on network resources.
We develop, to the best of our knowledge, the first machine learning framework for distributed functional compression over both the Gaussian Multiple Access Channel (GMAC) and AWGN channels.
arXiv Detail & Related papers (2022-01-24T06:38:39Z) - Collaborative Learning over Wireless Networks: An Introductory Overview [84.09366153693361]
We will mainly focus on collaborative training across wireless devices.
Many distributed optimization algorithms have been developed over the last decades.
They provide data locality; that is, a joint model can be trained collaboratively while the data available at each participating device remains local.
arXiv Detail & Related papers (2021-12-07T20:15:39Z) - ProgFed: Effective, Communication, and Computation Efficient Federated Learning by Progressive Training [65.68511423300812]
We propose ProgFed, a progressive training framework for efficient and effective federated learning.
ProgFed inherently reduces computation and two-way communication costs while maintaining the strong performance of the final models.
Our results show that ProgFed converges at the same rate as standard training on full models.
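A minimal sketch of the progressive idea as summarized here, under the assumption of a simple linear growth schedule (the paper's actual schedule and model splitting may differ): early rounds train and communicate only a shallow prefix of the model, and deeper blocks are activated as training proceeds.

```python
def active_blocks(round_idx, total_rounds, num_blocks, grow_fraction=0.5):
    """Number of leading model blocks trained and communicated at this round.
    Grows linearly from 1 to num_blocks over the first grow_fraction of rounds
    (an assumed schedule for illustration only)."""
    grow_rounds = max(1, int(total_rounds * grow_fraction))
    frac = min(1.0, (round_idx + 1) / grow_rounds)
    return max(1, round(frac * num_blocks))

# Only the first active_blocks(...) blocks (plus a small temporary head)
# would be updated locally and exchanged with the server in that round.
```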
arXiv Detail & Related papers (2021-10-11T14:45:00Z) - A Linearly Convergent Algorithm for Decentralized Optimization: Sending
Less Bits for Free! [72.31332210635524]
Decentralized optimization methods enable on-device training of machine learning models without a central coordinator.
We propose a new randomized first-order method which tackles the communication bottleneck by applying randomized compression operators.
We prove that our method can solve the problems without any increase in the number of communications compared to the baseline.
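A minimal sketch of one common randomized compression operator (rand-k sparsification with unbiased rescaling); the operator used in the paper and the surrounding decentralized protocol are not reproduced here, so treat this purely as an illustration of the idea.

```python
import numpy as np

def rand_k(x, k, rng=None):
    """Keep k uniformly chosen coordinates of a 1-D vector x, rescaled so the
    result is an unbiased estimate of x; only the kept indices and values
    need to be communicated."""
    rng = rng or np.random.default_rng()
    out = np.zeros_like(x)
    idx = rng.choice(x.size, size=k, replace=False)
    out[idx] = x[idx] * (x.size / k)   # rescale for unbiasedness: E[out] = x
    return out
```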
arXiv Detail & Related papers (2020-11-03T13:35:53Z) - Federated Self-Supervised Learning of Multi-Sensor Representations for
Embedded Intelligence [8.110949636804772]
Smartphones, wearables, and Internet of Things (IoT) devices produce a wealth of data that cannot be accumulated in a centralized repository for learning supervised models.
We propose a self-supervised approach termed scalogram-signal correspondence learning based on wavelet transform to learn useful representations from unlabeled sensor inputs.
We extensively assess the quality of learned features with our multi-view strategy on diverse public datasets, achieving strong performance in all domains.
arXiv Detail & Related papers (2020-07-25T21:59:17Z) - A Compressive Sensing Approach for Federated Learning over Massive MIMO
Communication Systems [82.2513703281725]
Federated learning is a privacy-preserving approach to train a global model at a central server by collaborating with wireless devices.
We present a compressive sensing approach for federated learning over massive multiple-input multiple-output communication systems.
arXiv Detail & Related papers (2020-03-18T05:56:27Z) - Ternary Compression for Communication-Efficient Federated Learning [17.97683428517896]
Federated learning provides a potential solution to privacy-preserving and secure machine learning.
We propose a ternary federated averaging protocol (T-FedAvg) to reduce the upstream and downstream communication of federated learning systems.
Our results show that the proposed T-FedAvg is effective in reducing communication costs and can even achieve slightly better performance on non-IID data.
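A minimal sketch of ternary quantization of a local model update, in the spirit of T-FedAvg as summarized above; the threshold heuristic and per-tensor scale below are assumptions, not the protocol's exact construction.

```python
import numpy as np

def ternarize(update):
    """Map each entry of a weight update to {-alpha, 0, +alpha}, so only a
    2-bit symbol per entry plus one scale alpha needs to be transmitted."""
    threshold = 0.7 * np.mean(np.abs(update))   # assumed threshold heuristic
    mask = np.abs(update) > threshold
    alpha = np.mean(np.abs(update[mask])) if mask.any() else 0.0
    return alpha * np.sign(update) * mask
```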
arXiv Detail & Related papers (2020-03-07T11:55:34Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.