FedComm: Federated Learning as a Medium for Covert Communication
- URL: http://arxiv.org/abs/2201.08786v3
- Date: Wed, 17 May 2023 14:37:18 GMT
- Title: FedComm: Federated Learning as a Medium for Covert Communication
- Authors: Dorjan Hitaj, Giulio Pagnotta, Briland Hitaj, Fernando Perez-Cruz,
Luigi V. Mancini
- Abstract summary: Federated Learning (FL) is a solution to mitigate the privacy implications related to the adoption of deep learning.
This paper thoroughly investigates the communication capabilities of an FL scheme.
We introduce FedComm, a novel multi-system covert-communication technique.
- Score: 56.376997104843355
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Proposed as a solution to mitigate the privacy implications related to the
adoption of deep learning, Federated Learning (FL) enables large numbers of
participants to successfully train deep neural networks without having to
reveal the actual private training data. To date, a substantial amount of
research has investigated the security and privacy properties of FL, resulting
in a plethora of innovative attack and defense strategies. This paper
thoroughly investigates the communication capabilities of an FL scheme. In
particular, we show that a party involved in the FL learning process can use FL
as a covert communication medium to send an arbitrary message. We introduce
FedComm, a novel multi-system covert-communication technique that enables
robust sharing and transfer of targeted payloads within the FL framework. Our
extensive theoretical and empirical evaluations show that FedComm provides a
stealthy communication channel, with minimal disruptions to the training
process. Our experiments show that FedComm successfully delivers 100% of a
payload on the order of kilobits before the FL procedure converges. Our
evaluation also shows that FedComm is independent of the application domain and
the neural network architecture used by the underlying FL scheme.
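The abstract leaves the encoding mechanism implicit; the paper relies on CDMA-style spread-spectrum coding to hide the payload in the model parameters. The NumPy sketch below illustrates only that general idea: the function names, the gain value, and the round-trip check are assumptions for illustration, not FedComm's actual implementation.

```python
import numpy as np

def embed_payload(params, bits, gain=1e-3, seed=42):
    """Spread each payload bit over the flattened parameter vector with a
    pseudo-random +/-1 code shared with the receiver (CDMA-style)."""
    rng = np.random.default_rng(seed)
    flat = params.ravel().copy()
    for bit in bits:
        code = rng.choice([-1.0, 1.0], size=flat.size)
        flat += gain * (1.0 if bit else -1.0) * code
    return flat.reshape(params.shape)

def extract_payload(params, n_bits, seed=42):
    """Correlate the observed parameters with the same codes; training and
    aggregation noise averages out, so the sign recovers each bit."""
    rng = np.random.default_rng(seed)
    flat = params.ravel()
    return [float(rng.choice([-1.0, 1.0], size=flat.size) @ flat) > 0
            for _ in range(n_bits)]

# Round-trip check on a synthetic "model update":
params = np.random.randn(100_000) * 0.01
bits = [1, 0, 1, 1, 0, 0, 1, 0]
received = embed_payload(params, bits)
assert extract_payload(received, len(bits)) == [b == 1 for b in bits]
```

The key property is that each ±1 spreading code is nearly orthogonal both to the other codes and to the training signal, so the payload survives averaging during aggregation while perturbing individual weights only slightly.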
Related papers
- Exploring the Practicality of Federated Learning: A Survey Towards the Communication Perspective [1.088537320059347]
Federated Learning (FL) is a promising paradigm that offers significant advancements in privacy-preserving, decentralized machine learning.
However, the practical deployment of FL systems faces a significant bottleneck: the communication overhead.
This survey investigates various strategies and advancements made in communication-efficient FL.
arXiv Detail & Related papers (2024-05-30T19:21:33Z)
- FedComLoc: Communication-Efficient Distributed Training of Sparse and Quantized Models [56.21666819468249]
Federated Learning (FL) has garnered increasing attention due to its unique characteristic of allowing heterogeneous clients to process their private data locally and interact with a central server.
We introduce FedComLoc, integrating practical and effective compression into Scaffnew to further enhance communication efficiency; a generic compression sketch follows this entry.
arXiv Detail & Related papers (2024-03-14T22:29:59Z)
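As a rough illustration of the kind of compression FedComLoc builds on (the paper integrates compression into Scaffnew; its exact schemes are specified there), here is a generic top-k sparsification plus uniform quantization sketch in NumPy. All names and constants are illustrative.

```python
import numpy as np

def compress(update, k_frac=0.01, levels=127):
    """Top-k sparsification followed by uniform int8 quantization: only
    the indices, quantized values, and one scale are transmitted."""
    flat = update.ravel()
    k = max(1, int(k_frac * flat.size))
    idx = np.argpartition(np.abs(flat), -k)[-k:]      # top-k by magnitude
    scale = np.abs(flat[idx]).max() or 1.0
    q = np.round(flat[idx] / scale * levels).astype(np.int8)
    return idx, q, scale

def decompress(idx, q, scale, shape, levels=127):
    """Server-side reconstruction of the sparse, quantized update."""
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = q.astype(np.float64) / levels * scale
    return flat.reshape(shape)

update = np.random.randn(4, 256)
idx, q, scale = compress(update)
restored = decompress(idx, q, scale, update.shape)
```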
- A Survey on Efficient Federated Learning Methods for Foundation Model Training [62.473245910234304]
Federated Learning (FL) has become an established technique to facilitate privacy-preserving collaborative training across a multitude of clients.
In the wake of Foundation Models (FM), however, the situation is different for many deep learning applications.
We discuss the benefits and drawbacks of parameter-efficient fine-tuning (PEFT) for FL applications; an illustrative PEFT sketch follows this entry.
arXiv Detail & Related papers (2024-01-09T10:22:23Z)
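To make the communication savings of PEFT concrete, here is a hypothetical LoRA-style adapter in NumPy, one common PEFT technique; it illustrates the general approach, not a method specific to the survey.

```python
import numpy as np

class LoRALinear:
    """A frozen base weight W plus trainable low-rank factors B @ A.
    In federated fine-tuning, clients train and upload only A and B,
    a tiny fraction of the foundation model's parameters."""
    def __init__(self, w, rank=4, seed=0):
        rng = np.random.default_rng(seed)
        self.w = w                                  # frozen pre-trained weight
        self.a = 0.02 * rng.standard_normal((rank, w.shape[1]))
        self.b = np.zeros((w.shape[0], rank))       # zero init: delta starts at 0

    def __call__(self, x):
        return x @ (self.w + self.b @ self.a).T

    def trainable_params(self):
        return {"a": self.a, "b": self.b}           # the only tensors communicated

layer = LoRALinear(np.random.randn(768, 768))
frozen = layer.w.size                                         # 589,824 parameters
sent = sum(p.size for p in layer.trainable_params().values())  # 6,144 parameters
```

Only `a` and `b` cross the network each round: roughly 6K values for this layer instead of the ~590K values of the frozen 768×768 weight.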
- Communication Efficient and Privacy-Preserving Federated Learning Based on Evolution Strategies [0.0]
Federated learning (FL) is an emerging paradigm for training deep neural networks (DNNs) in a distributed manner.
In this work, we present a federated learning algorithm based on evolution strategies (FedES), a zeroth-order training method; a toy ES step is sketched after this entry.
arXiv Detail & Related papers (2023-11-05T21:40:46Z)
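A minimal sketch of the zeroth-order idea behind FedES, assuming a standard antithetic evolution-strategies estimator; parameter names and constants are illustrative, and the paper's exact algorithm and communication protocol may differ.

```python
import numpy as np

def es_step(theta, loss_fn, sigma=0.1, pop=20, lr=0.01, seed=0):
    """One antithetic evolution-strategies update: the gradient is
    estimated from scalar loss evaluations only, so a client could share
    just the RNG seed and the loss values instead of full gradients."""
    rng = np.random.default_rng(seed)
    grad = np.zeros_like(theta)
    for _ in range(pop):
        eps = rng.standard_normal(theta.shape)
        f_plus = loss_fn(theta + sigma * eps)
        f_minus = loss_fn(theta - sigma * eps)
        grad += (f_plus - f_minus) / (2 * sigma) * eps
    return theta - lr * grad / pop

# Toy check: minimize ||theta||^2 without ever computing its gradient.
theta = np.ones(10)
for t in range(200):
    theta = es_step(theta, lambda v: float(v @ v), seed=t)
```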
- UFed-GAN: A Secure Federated Learning Framework with Constrained Computation and Unlabeled Data [50.13595312140533]
We propose a novel framework, UFed-GAN (Unsupervised Federated Generative Adversarial Network), which captures the user-side data distribution without local classification training.
Our experimental results demonstrate the strong potential of UFed-GAN in addressing limited computational resources and unlabeled data while preserving privacy.
arXiv Detail & Related papers (2023-08-10T22:52:13Z)
- FedNC: A Secure and Efficient Federated Learning Method with Network Coding [18.556693764310328]
Federated Learning (FL) is a promising distributed learning mechanism which faces two major challenges, namely privacy breaches and system efficiency.
In this work, we reconceptualize the FL system from the perspective of network information theory, and formulate an original FL communication framework, FedNC, which is inspired by Network Coding (NC); a toy coding example follows this entry.
arXiv Detail & Related papers (2023-05-05T05:47:40Z)
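To illustrate the network-coding ingredient, here is a toy random linear network coding example in NumPy; it shows the generic NC mechanism, not FedNC's specific construction.

```python
import numpy as np

def nc_encode(updates, n_packets, seed=0):
    """Random linear network coding: every coded packet is a random
    linear combination of the clients' flattened updates, and the
    mixing coefficients travel with the packet."""
    rng = np.random.default_rng(seed)
    u = np.stack([v.ravel() for v in updates])          # (n_clients, dim)
    coeffs = rng.standard_normal((n_packets, len(updates)))
    return coeffs, coeffs @ u

def nc_decode_mean(coeffs, packets):
    """With at least n_clients independent packets, the server solves the
    linear system, recovers the updates, and averages them as usual."""
    u_hat, *_ = np.linalg.lstsq(coeffs, packets, rcond=None)
    return u_hat.mean(axis=0)

updates = [np.random.randn(1000) for _ in range(5)]
coeffs, packets = nc_encode(updates, n_packets=7)
assert np.allclose(nc_decode_mean(coeffs, packets), np.mean(updates, axis=0))
```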
- FedLP: Layer-wise Pruning Mechanism for Communication-Computation Efficient Federated Learning [15.665720478360557]
Federated learning (FL) has prevailed as an efficient and privacy-preserving scheme for distributed learning.
We formulate an explicit FL pruning framework, FedLP (Federated Layer-wise Pruning), which is model-agnostic and universal for different types of deep learning models; a minimal pruning sketch follows this entry.
arXiv Detail & Related papers (2023-03-11T09:57:00Z)
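A minimal sketch in the spirit of layer-wise pruning, assuming each client independently drops whole layers; FedLP's actual pruning policies are specified in the paper.

```python
import numpy as np

def prune_layers(state_dict, keep_prob=0.7, seed=0):
    """Each client independently drops whole layers and uploads only the
    survivors, cutting both computation and communication."""
    rng = np.random.default_rng(seed)
    return {name: w for name, w in state_dict.items()
            if rng.random() < keep_prob}

def aggregate(partial_updates):
    """The server averages each layer over the clients that sent it."""
    buckets = {}
    for upd in partial_updates:
        for name, w in upd.items():
            buckets.setdefault(name, []).append(w)
    return {name: np.mean(ws, axis=0) for name, ws in buckets.items()}

model = {"conv1": np.random.randn(8, 3, 3, 3), "fc": np.random.randn(10, 64)}
uploads = [prune_layers(model, seed=s) for s in range(5)]
global_update = aggregate(uploads)
```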
- FedDBL: Communication and Data Efficient Federated Deep-Broad Learning for Histopathological Tissue Classification [65.7405397206767]
We propose Federated Deep-Broad Learning (FedDBL) to achieve superior classification performance with limited training samples and only one-round communication.
FedDBL greatly outperforms the competitors with only one-round communication and limited training samples, while achieving performance comparable to methods that use multiple communication rounds.
Since no data or deep models are shared across clients, privacy is preserved and the model is not exposed to model-inversion attacks.
arXiv Detail & Related papers (2023-02-24T14:27:41Z)
- On the Design of Communication-Efficient Federated Learning for Health Monitoring [21.433739206682404]
We propose a communication-efficient federated learning (CEFL) framework that combines client clustering and transfer learning; a toy clustering sketch follows this entry.
CEFL can save up to 98.45% in communication costs while conceding less than 3% in accuracy loss, compared to conventional FL.
arXiv Detail & Related papers (2022-11-30T12:52:23Z)
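As a toy illustration of the clustering ingredient (CEFL's actual clustering criterion and transfer-learning stage are in the paper), the following sketch groups clients by k-means over their flattened updates.

```python
import numpy as np

def cluster_clients(updates, n_clusters=3, iters=10, seed=0):
    """Plain k-means over flattened client updates: clients whose models
    evolve similarly share a cluster, and one representative per cluster
    can exchange updates with the server to save bandwidth."""
    rng = np.random.default_rng(seed)
    x = np.stack([u.ravel() for u in updates])
    centers = x[rng.choice(len(x), n_clusters, replace=False)]
    for _ in range(iters):
        dists = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        for c in range(n_clusters):
            if (labels == c).any():
                centers[c] = x[labels == c].mean(axis=0)
    return labels

updates = [np.random.randn(500) + off for off in (0, 0, 5, 5, -5)]
print(cluster_clients(updates))   # three well-separated groups emerge
```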
- Federated Robustness Propagation: Sharing Adversarial Robustness in Federated Learning [98.05061014090913]
Federated learning (FL) emerges as a popular distributed learning scheme that learns from a set of participating users without requiring raw data to be shared.
While adversarial training (AT) provides a sound solution for centralized learning, extending it to FL users poses significant challenges.
We show that existing FL techniques cannot effectively propagate adversarial robustness among non-iid users.
We propose a simple yet effective propagation approach that transfers robustness through carefully designed batch-normalization statistics; a minimal BN-statistics sketch follows this entry.
arXiv Detail & Related papers (2021-06-18T15:52:33Z)
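The abstract says only that robustness is transferred "through carefully designed batch-normalization statistics"; the minimal sketch below shows just the plumbing of applying externally supplied BN statistics, with placeholder values, and is not the paper's design.

```python
import numpy as np

def bn_forward(x, gamma, beta, mean, var, eps=1e-5):
    """Batch normalization with externally supplied statistics."""
    return gamma * (x - mean) / np.sqrt(var + eps) + beta

# Illustrative flow: a client that ran adversarial training exports its BN
# statistics (a few KB), and a standard-training client normalizes with
# them, inheriting part of the robust feature distribution.
robust_mean, robust_var = np.zeros(64), np.ones(64)   # placeholder stats
x = np.random.randn(32, 64)
y = bn_forward(x, gamma=np.ones(64), beta=np.zeros(64),
               mean=robust_mean, var=robust_var)
```

In an actual FL round these statistics would ride along with the model update, adding negligible bandwidth.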