Learning With Differential Privacy
- URL: http://arxiv.org/abs/2006.05609v2
- Date: Thu, 11 Jun 2020 14:11:44 GMT
- Title: Learning With Differential Privacy
- Authors: Poushali Sengupta, Sudipta Paul, Subhankar Mishra
- Abstract summary: Differential privacy comes to the rescue with a formal promise of protection against leakage.
It uses a randomized-response technique at data-collection time, which provides strong privacy with good utility.
- Score: 3.618133010429131
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Data leakage can have severe consequences at the personal level when the
data contains sensitive information. Common prevention methods such as
encryption/decryption, endpoint protection, and intrusion detection systems
remain prone to leakage. Differential privacy comes to the rescue with a formal
promise of protection against leakage: it uses a randomized-response technique
at data-collection time, which provides strong privacy with good utility.
Differential privacy allows one to study the forest of data, describing
patterns across groups, without disclosing any individual trees. The current
adoption of differential privacy by leading tech companies and academia
encourages the authors to explore the topic in detail. This paper discusses the
different aspects of differential privacy, its application to privacy
protection and information leakage, a comparative discussion of current
research approaches in the field, its real-world utility, and the associated
trade-offs.
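To make the randomized-response idea above concrete, here is a minimal sketch of the classic mechanism for a single yes/no attribute; the function names and the p = 0.75 setting are our own illustrative choices, not taken from the paper.

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true bit with probability p, the opposite bit otherwise.

    This satisfies epsilon-local differential privacy with
    epsilon = ln(p / (1 - p)); p = 0.75 gives the classic ln(3) guarantee.
    """
    return truth if random.random() < p else not truth

def estimate_true_rate(reports: list[bool], p: float = 0.75) -> float:
    """Debias the observed 'yes' rate: E[lam] = p*pi + (1-p)*(1-pi)."""
    lam = sum(reports) / len(reports)
    return (lam - (1 - p)) / (2 * p - 1)

# Example: 10,000 respondents, 30% of whom truly hold the sensitive bit.
truths = [random.random() < 0.30 for _ in range(10_000)]
reports = [randomized_response(t) for t in truths]
print(f"estimated true rate: {estimate_true_rate(reports):.3f}")  # ~0.30
```

Each individual report is deniable, yet the debiasing step recovers the population rate in expectation, which is the "strong privacy with better utility" trade the abstract refers to.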
Related papers
- Differential Privacy Overview and Fundamental Techniques [63.0409690498569]
This chapter is meant to be part of the book "Differential Privacy in Artificial Intelligence: From Theory to Practice".
It starts by illustrating various attempts to protect data privacy, emphasizing where and why they failed.
It then defines the key actors, tasks, and scopes that make up the domain of privacy-preserving data analysis.
arXiv Detail & Related papers (2024-11-07T13:52:11Z) - Masked Differential Privacy [64.32494202656801]
We propose an effective approach called masked differential privacy (DP), which allows for controlling the sensitive regions where differential privacy is applied.
Our method operates selectively on the data: it allows defining non-sensitive temporal regions that receive no DP perturbation, or combining differential privacy with other privacy techniques within data samples.
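The summary does not spell out the mechanism, so the following is only a rough illustration of the selective idea under our own assumptions: Laplace noise, calibrated as in the standard Laplace mechanism, is added inside a boolean sensitivity mask and nowhere else.

```python
import numpy as np

def masked_laplace(data: np.ndarray, mask: np.ndarray, sensitivity: float,
                   epsilon: float, rng: np.random.Generator) -> np.ndarray:
    """Add Laplace noise only where mask is True; pass the rest through.

    The scale sensitivity/epsilon is the standard Laplace-mechanism
    calibration; applying it selectively is the illustrative part.
    """
    noise = rng.laplace(0.0, sensitivity / epsilon, size=data.shape)
    return np.where(mask, data + noise, data)

rng = np.random.default_rng(0)
frame = rng.uniform(0.0, 255.0, size=(8, 8))  # toy "image"
mask = np.zeros((8, 8), dtype=bool)
mask[2:5, 2:5] = True                         # hypothetical sensitive region
private = masked_laplace(frame, mask, sensitivity=255.0, epsilon=1.0, rng=rng)
```

Everything outside the mask is released exactly, which is the utility gain such selective schemes aim for.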
arXiv Detail & Related papers (2024-10-22T15:22:53Z) - Models Matter: Setting Accurate Privacy Expectations for Local and Central Differential Privacy [14.40391109414476]
We design and evaluate new explanations of differential privacy for the local and central models.
We find that consequences-focused explanations in the style of privacy nutrition labels are a promising approach for setting accurate privacy expectations.
arXiv Detail & Related papers (2024-08-16T01:21:57Z) - $\alpha$-Mutual Information: A Tunable Privacy Measure for Privacy Protection in Data Sharing [4.475091558538915]
This paper adopts Arimoto's $\alpha$-Mutual Information as a tunable privacy measure.
We formulate a general distortion-based mechanism that manipulates the original data to offer privacy protection.
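For reference, one common way to write Arimoto's $\alpha$-mutual information for discrete $X$ and $Y$ (the paper may use a different but equivalent parameterization) is

$$ I_\alpha(X;Y) = H_\alpha(X) - H_\alpha^{\mathrm{A}}(X \mid Y), \qquad H_\alpha(X) = \frac{1}{1-\alpha}\log\sum_x P_X(x)^\alpha, $$

$$ H_\alpha^{\mathrm{A}}(X \mid Y) = \frac{\alpha}{1-\alpha}\log\sum_y \Big(\sum_x P_{X,Y}(x,y)^\alpha\Big)^{1/\alpha}, $$

where $H_\alpha$ is the Rényi entropy and $H_\alpha^{\mathrm{A}}$ is Arimoto's conditional entropy; as $\alpha \to 1$ the quantity recovers Shannon mutual information, which is what makes $\alpha$ a tunable knob on the leakage measure.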
arXiv Detail & Related papers (2023-10-27T16:26:14Z) - A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
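The paper surveys DP generative models; as a much simpler baseline for "releasing only a sanitized form of the data", here is a sketch of the standard Laplace mechanism applied to a histogram. The data and parameters are invented for illustration.

```python
import numpy as np

def dp_histogram(values: np.ndarray, bins: int, epsilon: float,
                 rng: np.random.Generator) -> np.ndarray:
    """Release a noisy histogram.

    Adding or removing one person changes one bin count by 1 (L1
    sensitivity 1 under add/remove adjacency), so Laplace(1/epsilon)
    noise per bin yields epsilon-DP.
    """
    counts, _ = np.histogram(values, bins=bins)
    return counts + rng.laplace(0.0, 1.0 / epsilon, size=counts.shape)

rng = np.random.default_rng(1)
ages = rng.integers(18, 90, size=5_000)       # invented data
noisy_counts = dp_histogram(ages, bins=9, epsilon=0.5, rng=rng)
```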
arXiv Detail & Related papers (2023-09-27T14:38:16Z) - How Do Input Attributes Impact the Privacy Loss in Differential Privacy? [55.492422758737575]
We study the connection between the per-subject norm in DP neural networks and individual privacy loss.
We introduce a novel metric, termed the Privacy Loss-Input Susceptibility (PLIS), which allows one to apportion a subject's privacy loss to their input attributes.
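PLIS itself is defined in the paper; the hypothetical sketch below only shows where per-subject gradient norms arise, using a linear model with squared loss so the per-sample gradients have a closed form.

```python
import numpy as np

def per_sample_grad_norms(X: np.ndarray, y: np.ndarray,
                          w: np.ndarray) -> np.ndarray:
    """L2 norms of per-sample gradients for the loss 0.5*(x_i.w - y_i)^2.

    grad_i = (x_i . w - y_i) * x_i, so ||grad_i|| = |residual_i| * ||x_i||.
    """
    residuals = X @ w - y
    return np.abs(residuals) * np.linalg.norm(X, axis=1)

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
y = rng.normal(size=100)
norms = per_sample_grad_norms(X, y, w=np.zeros(5))
# In DP-SGD these norms are clipped to a bound C before noise is added;
# subjects whose norms sit at the clip bound are the ones a PLIS-style
# attribution would examine.
```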
arXiv Detail & Related papers (2022-11-18T11:39:03Z) - Algorithms with More Granular Differential Privacy Guarantees [65.3684804101664]
We consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis.
In this work, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person.
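The paper's algorithms are task-specific; purely to illustrate what a per-attribute privacy parameter can mean, this hypothetical sketch spends a separate Laplace budget on each attribute of a numeric record. All names, sensitivities, and budgets below are our assumptions.

```python
import numpy as np

def per_attribute_laplace(record: np.ndarray, sensitivities: np.ndarray,
                          epsilons: np.ndarray,
                          rng: np.random.Generator) -> np.ndarray:
    """Noise attribute j with scale sensitivity_j / epsilon_j, so each
    attribute is protected at its own level epsilon_j against changes
    to that attribute alone."""
    return record + rng.laplace(0.0, sensitivities / epsilons)

rng = np.random.default_rng(3)
record = np.array([42.0, 70_000.0, 1.0])  # age, income, diagnosis flag
sens = np.array([100.0, 1e6, 1.0])        # assumed per-attribute ranges
eps = np.array([2.0, 0.5, 0.1])           # tightest budget for the diagnosis
noisy_record = per_attribute_laplace(record, sens, eps, rng)
```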
arXiv Detail & Related papers (2022-09-08T22:43:50Z) - The Privacy Onion Effect: Memorization is Relative [76.46529413546725]
We show an Onion Effect of memorization: removing the "layer" of outlier points that are most vulnerable exposes a new layer of previously-safe points to the same attack.
This suggests that privacy-enhancing technologies such as machine unlearning could actually harm the privacy of other users.
arXiv Detail & Related papers (2022-06-21T15:25:56Z) - HyObscure: Hybrid Obscuring for Privacy-Preserving Data Publishing [7.554593344695387]
Minimizing privacy leakage while ensuring data utility is a critical problem for data holders in a privacy-preserving data publishing task.
Most prior research considers only one type of data and resorts to a single obscuring method.
This work takes a pilot study on privacy-preserving data publishing when both generalization and obfuscation operations are employed.
arXiv Detail & Related papers (2021-12-15T03:04:00Z) - "I need a better description": An Investigation Into User Expectations For Differential Privacy [31.352325485393074]
We explore users' privacy expectations related to differential privacy.
We find that users care about the kinds of information leaks against which differential privacy protects.
We find that the ways in which differential privacy is described in the wild haphazardly set users' privacy expectations.
arXiv Detail & Related papers (2021-10-13T02:36:37Z) - Swarm Differential Privacy for Purpose Driven Data-Information-Knowledge-Wisdom Architecture [2.38142799291692]
We will explore the privacy protection of the broad Data-Information-Knowledge-Wisdom (DIKW) landscape.
As differential privacy proved to be an effective data privacy approach, we will look at it from a DIKW domain perspective.
Swarm Intelligence could effectively optimize and reduce the number of items in DIKW used in differential privacy.
arXiv Detail & Related papers (2021-05-09T23:09:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.