Using Decentralized Aggregation for Federated Learning with Differential
Privacy
- URL: http://arxiv.org/abs/2311.16008v1
- Date: Mon, 27 Nov 2023 17:02:56 GMT
- Title: Using Decentralized Aggregation for Federated Learning with Differential
Privacy
- Authors: Hadeel Abd El-Kareem and Abd El-Moaty Saleh and Ana Fernández-Vilas
and Manuel Fernández-Veiga and Yasser El-Sonbaty
- Abstract summary: Federated Learning (FL) provides some level of privacy by retaining the data at the local node.
This research deploys an experimental environment for FL with Differential Privacy (DP) using benchmark datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Nowadays, the ubiquitous use of mobile devices and networks has raised
concerns about the loss of control over personal data, and research is advancing
toward the trade-off between privacy and utility in scenarios that combine
communication exchanges, big databases, and distributed and collaborative (P2P)
Machine Learning techniques. Although Federated Learning (FL) provides some level
of privacy by retaining the data at the local node, which executes local training
to enrich a global model, this scenario is still susceptible to privacy breaches
such as membership inference attacks. To provide a stronger level of privacy, this
research deploys an experimental environment for FL with Differential Privacy (DP)
using benchmark datasets. The obtained results, based on a classification example,
show that the selection of DP parameters and techniques is central to the
aforementioned trade-off between privacy and utility.
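As a concrete illustration of the DP knobs the abstract alludes to, below is a minimal sketch (Python with NumPy, assuming model weights are flat arrays) of a DP-FedAvg-style client update: the client clips its update to an L2 bound and adds Gaussian noise before sharing it with the aggregator. The function name and parameter values are illustrative assumptions, not the authors' code.

```python
import numpy as np

def dp_client_update(global_weights, local_weights, clip_norm=1.0,
                     noise_multiplier=1.1, rng=None):
    """Illustrative DP treatment of one client's update (a sketch, not the paper's code).

    The raw update is clipped to L2 norm `clip_norm` and perturbed with Gaussian
    noise of standard deviation `noise_multiplier * clip_norm`, the usual
    Gaussian-mechanism recipe behind DP federated averaging.
    """
    rng = rng or np.random.default_rng()
    update = local_weights - global_weights                  # what the client would send
    scale = min(1.0, clip_norm / (np.linalg.norm(update) + 1e-12))
    update = update * scale                                   # bound the sensitivity
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return update + noise                                     # noisy update for the aggregator

# Toy round: three clients drift slightly from the global model; the server averages.
rng = np.random.default_rng(0)
global_w = np.zeros(10)
client_ws = [global_w + rng.normal(0.0, 0.1, 10) for _ in range(3)]
noisy_updates = [dp_client_update(global_w, w, rng=rng) for w in client_ws]
global_w = global_w + np.mean(noisy_updates, axis=0)
```

Larger noise multipliers and tighter clipping give stronger privacy but degrade model accuracy, which is the trade-off the paper's classification experiments explore.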
Related papers
- Masked Differential Privacy [64.32494202656801]
We propose an effective approach called masked differential privacy (DP), which allows for controlling sensitive regions where differential privacy is applied.
Our method operates selectively on data and allows for defining non-sensitive spatio-temporal regions without DP application, or combining differential privacy with other privacy techniques within data samples.
arXiv Detail & Related papers (2024-10-22T15:22:53Z) - Towards Split Learning-based Privacy-Preserving Record Linkage [49.1574468325115]
Split Learning has been introduced to facilitate applications where user data privacy is a requirement.
In this paper, we investigate the potential of Split Learning for Privacy-Preserving Record Matching.
arXiv Detail & Related papers (2024-09-02T09:17:05Z) - Privacy in Federated Learning [0.0]
Federated Learning (FL) represents a significant advancement in distributed machine learning.
This chapter delves into the core privacy concerns within FL, including the risks of data reconstruction, model inversion attacks, and membership inference.
It examines the trade-offs between model accuracy and privacy, emphasizing the importance of balancing these factors in practical implementations.
arXiv Detail & Related papers (2024-08-12T18:41:58Z) - FewFedPIT: Towards Privacy-preserving and Few-shot Federated Instruction Tuning [54.26614091429253]
Federated instruction tuning (FedIT) is a promising solution that consolidates collaborative training across multiple data owners.
FedIT encounters limitations such as the scarcity of instruction data and the risk of exposure to training-data extraction attacks.
We propose FewFedPIT, designed to simultaneously enhance privacy protection and model performance of federated few-shot learning.
arXiv Detail & Related papers (2024-03-10T08:41:22Z) - Local Privacy-preserving Mechanisms and Applications in Machine Learning [0.21268495173320798]
Local Differential Privacy (LDP) provides strong privacy protection for individual users during the stages of data collection and processing.
One of the major applications of these privacy-preserving mechanisms is machine learning (see the randomized-response sketch after this list).
arXiv Detail & Related papers (2024-01-08T22:29:00Z) - A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z) - Personalization Improves Privacy-Accuracy Tradeoffs in Federated
Optimization [57.98426940386627]
We show that coordinating local learning with private centralized learning yields a generically useful and improved tradeoff between accuracy and privacy.
We illustrate our theoretical results with experiments on synthetic and real-world datasets.
arXiv Detail & Related papers (2022-02-10T20:44:44Z) - Mitigating Leakage from Data Dependent Communications in Decentralized
Computing using Differential Privacy [1.911678487931003]
We propose a general execution model to control the data-dependence of communications in user-side decentralized computations.
Our formal privacy guarantees leverage and extend recent results on privacy amplification by shuffling.
arXiv Detail & Related papers (2021-12-23T08:30:17Z) - Distributed Machine Learning and the Semblance of Trust [66.1227776348216]
Federated Learning (FL) allows the data owner to maintain data governance and perform model training locally without having to share their data.
FL and related techniques are often described as privacy-preserving.
We explain why this term is not appropriate and outline the risks associated with over-reliance on protocols that were not designed with formal definitions of privacy in mind.
arXiv Detail & Related papers (2021-12-21T08:44:05Z) - A Review of Privacy-preserving Federated Learning for the
Internet-of-Things [3.3517146652431378]
This work reviews federated learning as an approach for performing machine learning on distributed data.
We aim to protect the privacy of user-generated data as well as to reduce the communication costs associated with data transfer.
We identify the strengths and weaknesses of different methods applied to federated learning.
arXiv Detail & Related papers (2020-04-24T15:27:23Z) - Concentrated Differentially Private and Utility Preserving Federated
Learning [24.239992194656164]
Federated learning is a machine learning setting where a set of edge devices collaboratively train a model under the orchestration of a central server.
In this paper, we develop a federated learning approach that addresses the privacy challenge without much degradation in model utility.
We provide a tight end-to-end privacy guarantee of our approach and analyze its theoretical convergence rates.
arXiv Detail & Related papers (2020-03-30T19:20:42Z)
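As referenced in the Local Privacy-preserving Mechanisms entry above, the following is a minimal sketch of a local DP primitive, binary randomized response, in which each user perturbs a single bit before it ever leaves the device. The function names and the epsilon value are illustrative assumptions, not code from any of the listed papers.

```python
import math
import random

def randomized_response(true_bit: int, epsilon: float, rng: random.Random) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1), otherwise flip it.

    This satisfies epsilon-local differential privacy for a single binary value.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if rng.random() < p_truth else 1 - true_bit

def debiased_mean(reports, epsilon):
    """Unbiased estimate of the true proportion of 1s from the noisy reports."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

# Toy usage: 10,000 users, 30% hold the sensitive attribute, epsilon = 1.0.
rng = random.Random(0)
true_bits = [1 if rng.random() < 0.3 else 0 for _ in range(10_000)]
reports = [randomized_response(b, 1.0, rng) for b in true_bits]
print(round(debiased_mean(reports, 1.0), 3))  # close to 0.3 despite per-user noise
```

Smaller epsilon values flip bits more often, so the aggregate estimate becomes noisier: the same privacy-utility tension discussed in the main abstract, here at the level of data collection rather than model training.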
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.