Privacy-Preserving Distributed Optimization and Learning
- URL: http://arxiv.org/abs/2403.00157v1
- Date: Thu, 29 Feb 2024 22:18:05 GMT
- Title: Privacy-Preserving Distributed Optimization and Learning
- Authors: Ziqin Chen and Yongqiang Wang
- Abstract summary: We discuss cryptography, differential privacy, and other techniques that can be used for privacy preservation.
We introduce several differential-privacy algorithms that can simultaneously ensure privacy and optimization accuracy.
We provide example applications in several machine learning problems to confirm the real-world effectiveness of these algorithms.
- Score: 2.1271873498506038
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Distributed optimization and learning has recently garnered great attention
due to its wide applications in sensor networks, smart grids, machine learning,
and so forth. Despite rapid development, existing distributed optimization and
learning algorithms require each agent to exchange messages with its neighbors,
which may expose sensitive information and raise significant privacy concerns.
In this survey paper, we overview privacy-preserving distributed optimization
and learning methods. We first discuss cryptography, differential privacy, and
other techniques that can be used for privacy preservation and indicate their
pros and cons for privacy protection in distributed optimization and learning.
We believe that among these approaches, differential privacy is most promising
due to its low computational and communication complexities, which are
extremely appealing for modern learning-based applications with
high-dimensional optimization variables. We then introduce several differential-privacy
algorithms that can simultaneously ensure privacy and optimization accuracy.
Moreover, we provide example applications in several machine learning problems
to confirm the real-world effectiveness of these algorithms. Finally, we
highlight some challenges in this research domain and discuss future
directions.
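The core differential-privacy mechanism the survey favors, injecting calibrated noise into exchanged messages so that no single agent's data can be inferred, can be sketched as a clipped, noised gradient step. This is a minimal illustration, not the authors' specific algorithm; the function name, clipping threshold, and noise scale are illustrative assumptions.

```python
import numpy as np

def dp_gradient_step(theta, grad, lr=0.1, clip=1.0, sigma=0.5, rng=None):
    """One differentially private gradient step (illustrative sketch):
    clip the gradient to bound its sensitivity, then add Gaussian noise
    before the update is shared with neighboring agents."""
    rng = rng or np.random.default_rng(0)
    norm = np.linalg.norm(grad)
    clipped = grad / max(1.0, norm / clip)  # ensures ||clipped|| <= clip
    noisy = clipped + rng.normal(0.0, sigma * clip, size=grad.shape)
    return theta - lr * noisy
```

Because only the clipped-and-noised gradient leaves the agent, the communication cost is the same as the non-private algorithm, which is the low-overhead property the abstract highlights relative to cryptographic approaches.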
Related papers
- Masked Differential Privacy [64.32494202656801]
We propose an effective approach called masked differential privacy (DP), which allows for controlling sensitive regions where differential privacy is applied.
Our method operates selectively on data and allows for defining non-sensitive-temporal regions without DP application or combining differential privacy with other privacy techniques within data samples.
arXiv Detail & Related papers (2024-10-22T15:22:53Z) - Synergizing Privacy and Utility in Data Analytics Through Advanced Information Theorization [2.28438857884398]
We introduce three sophisticated algorithms: a Noise-Infusion Technique tailored for high-dimensional image data, a Variational Autoencoder (VAE) for robust feature extraction and an Expectation Maximization (EM) approach optimized for structured data privacy.
Our methods significantly reduce mutual information between sensitive attributes and transformed data, thereby enhancing privacy.
The research contributes to the field by providing a flexible and effective strategy for deploying privacy-preserving algorithms across various data types.
arXiv Detail & Related papers (2024-04-24T22:58:42Z) - A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z) - Locally Differentially Private Distributed Online Learning with Guaranteed Optimality [1.800614371653704]
This paper proposes an approach that ensures both differential privacy and learning accuracy in distributed online learning.
While ensuring a diminishing expected instantaneous regret, the approach can simultaneously ensure a finite cumulative privacy budget.
To the best of our knowledge, this is the first algorithm that successfully ensures both rigorous local differential privacy and learning accuracy.
arXiv Detail & Related papers (2023-06-25T02:05:34Z) - Theoretically Principled Federated Learning for Balancing Privacy and Utility [61.03993520243198]
We propose a general learning framework for the protection mechanisms that protects privacy via distorting model parameters.
It can achieve personalized utility-privacy trade-off for each model parameter, on each client, at each communication round in federated learning.
arXiv Detail & Related papers (2023-05-24T13:44:02Z) - On Differential Privacy for Federated Learning in Wireless Systems with Multiple Base Stations [90.53293906751747]
We consider a federated learning model in a wireless system with multiple base stations and inter-cell interference.
We show the convergence behavior of the learning process by deriving an upper bound on its optimality gap.
Our proposed scheduler improves the average accuracy of the predictions compared with a random scheduler.
arXiv Detail & Related papers (2022-08-25T03:37:11Z) - Decentralized Stochastic Optimization with Inherent Privacy Protection [103.62463469366557]
Decentralized optimization is the basic building block of modern collaborative machine learning, distributed estimation and control, and large-scale sensing.
Since the involved data often contain sensitive information, privacy protection has become an increasingly pressing need in the implementation of decentralized optimization algorithms.
arXiv Detail & Related papers (2022-05-08T14:38:23Z) - Swarm Differential Privacy for Purpose Driven Data-Information-Knowledge-Wisdom Architecture [2.38142799291692]
We will explore the privacy protection of the broad Data-Information-Knowledge-Wisdom (DIKW) landscape.
As differential privacy proved to be an effective data privacy approach, we will look at it from a DIKW domain perspective.
Swarm Intelligence could effectively optimize and reduce the number of items in DIKW used in differential privacy.
arXiv Detail & Related papers (2021-05-09T23:09:07Z) - More Than Privacy: Applying Differential Privacy in Key Areas of Artificial Intelligence [62.3133247463974]
We show that differential privacy can do more than just privacy preservation in AI.
It can also be used to improve security, stabilize learning, build fair models, and impose composition in selected areas of AI.
arXiv Detail & Related papers (2020-08-05T03:07:36Z) - Differentially private cross-silo federated learning [16.38610531397378]
Strict privacy is of paramount importance in distributed machine learning.
In this paper we combine additively homomorphic secure summation protocols with differential privacy in the so-called cross-silo federated learning setting.
We demonstrate that our proposed solutions give prediction accuracy that is comparable to the non-distributed setting.
arXiv Detail & Related papers (2020-07-10T18:15:10Z)
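The cross-silo entry above combines additively homomorphic secure summation with differential privacy: each client noises its update locally and secret-shares it, so the aggregator only ever sees the noisy sum. A minimal sketch using plain additive secret sharing (a stand-in for the paper's homomorphic protocol; all names and parameters here are illustrative assumptions):

```python
import numpy as np

def additive_shares(value, n_parties, rng):
    """Split a vector into n additive shares that sum exactly to `value`.
    Any n-1 shares alone look like random noise."""
    shares = [rng.normal(size=value.shape) for _ in range(n_parties - 1)]
    shares.append(value - sum(shares))
    return shares

def secure_sum_with_dp(updates, sigma=0.1, seed=0):
    """Each client adds Gaussian noise to its update, then secret-shares it.
    Summing every share reveals only the noisy aggregate, never an
    individual client's update."""
    rng = np.random.default_rng(seed)
    noised = [u + rng.normal(0.0, sigma, size=u.shape) for u in updates]
    all_shares = [additive_shares(u, len(updates), rng) for u in noised]
    return sum(s for shares in all_shares for s in shares)
```

Because the per-client noise is averaged out in the aggregate, the total perturbation can be smaller than in purely local DP, which is consistent with the entry's claim of accuracy comparable to the non-distributed setting.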
This list is automatically generated from the titles and abstracts of the papers in this site.