Composition of Differential Privacy & Privacy Amplification by Subsampling
- URL: http://arxiv.org/abs/2210.00597v2
- Date: Tue, 4 Oct 2022 21:41:58 GMT
- Title: Composition of Differential Privacy & Privacy Amplification by Subsampling
- Authors: Thomas Steinke
- Abstract summary: This chapter is meant to be part of the book "Differential Privacy for Artificial Intelligence Applications"
We give an introduction to the most important property of differential privacy -- composition.
This chapter introduces the basic concepts and gives proofs of the key results needed to apply these tools in practice.
- Score: 18.397305289651907
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This chapter is meant to be part of the book "Differential Privacy for
Artificial Intelligence Applications." We give an introduction to the most
important property of differential privacy -- composition: running multiple
independent analyses on the data of a set of people will still be
differentially private as long as each of the analyses is private on its own --
as well as the related topic of privacy amplification by subsampling. This
chapter introduces the basic concepts and gives proofs of the key results
needed to apply these tools in practice.
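The two headline results of the chapter can be made concrete with a short sketch. The following Python functions (illustrative names, not taken from the chapter) compute the standard guarantees: basic composition of k independent (eps, delta)-DP analyses, the advanced composition theorem, and privacy amplification by subsampling a q-fraction of the data.

```python
import math

def basic_composition(eps, delta, k):
    """Basic composition: k independent (eps, delta)-DP analyses
    are together (k*eps, k*delta)-DP."""
    return k * eps, k * delta

def advanced_composition(eps, delta, k, delta_prime):
    """Advanced composition: for any slack delta_prime > 0, the k-fold
    composition of (eps, delta)-DP mechanisms is
    (eps_total, k*delta + delta_prime)-DP, where eps_total grows
    roughly like sqrt(k)*eps instead of k*eps."""
    eps_total = (math.sqrt(2 * k * math.log(1 / delta_prime)) * eps
                 + k * eps * (math.exp(eps) - 1))
    return eps_total, k * delta + delta_prime

def subsampled(eps, delta, q):
    """Privacy amplification by subsampling: running an (eps, delta)-DP
    mechanism on a random q-fraction of the dataset satisfies
    (log(1 + q*(e^eps - 1)), q*delta)-DP, which is roughly (q*eps, q*delta)
    when eps is small."""
    return math.log1p(q * (math.exp(eps) - 1)), q * delta
```

For many small-epsilon steps, `advanced_composition` gives a noticeably smaller total epsilon than `basic_composition`, which is why it (and its refinements) matter in practice for iterative algorithms such as DP-SGD.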
Related papers
- Differential Privacy Overview and Fundamental Techniques [63.0409690498569]
This chapter is meant to be part of the book "Differential Privacy in Artificial Intelligence: From Theory to Practice"
It starts by illustrating various attempts to protect data privacy, emphasizing where and why they failed.
It then defines the key actors, tasks, and scopes that make up the domain of privacy-preserving data analysis.
arXiv Detail & Related papers (2024-11-07T13:52:11Z)
- A Statistical Viewpoint on Differential Privacy: Hypothesis Testing, Representation and Blackwell's Theorem [30.365274034429508]
We argue that differential privacy can be considered a pure statistical concept.
$f$-differential privacy is a unified framework for analyzing privacy bounds in data analysis and machine learning.
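For context, the trade-off-function formulation underlying $f$-differential privacy can be stated as follows (a sketch of the standard definition, with notation assumed rather than taken from this listing):

```latex
A mechanism $M$ is \emph{$f$-differentially private} if for all neighboring
datasets $D, D'$,
\[
  T\bigl(M(D), M(D')\bigr) \geq f ,
\]
where the trade-off function $T(P,Q) : [0,1] \to [0,1]$ maps a type I error
budget $\alpha$ to the smallest achievable type II error of any test
distinguishing $P$ from $Q$. Classical $(\varepsilon,\delta)$-DP is the
special case $f = f_{\varepsilon,\delta}$ with
\[
  f_{\varepsilon,\delta}(\alpha)
  = \max\bigl\{\, 0,\; 1 - \delta - e^{\varepsilon}\alpha,\;
      e^{-\varepsilon}(1 - \delta - \alpha) \,\bigr\}.
\]
```

The hypothesis-testing view makes composition and subsampling statements about trade-off functions rather than about $(\varepsilon,\delta)$ pairs, which is what the cited paper means by a unified framework.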
arXiv Detail & Related papers (2024-09-14T23:47:22Z)
- Privacy-Preserving ECG Data Analysis with Differential Privacy: A Literature Review and A Case Study [1.1156009461711638]
We provide an overview of key concepts in differential privacy, followed by a literature review and discussion of its application to ECG analysis.
In the second part of the paper, we explore how to implement differentially private query release on an arrhythmia database using a six-step process.
arXiv Detail & Related papers (2024-06-19T23:17:16Z)
- How Do Input Attributes Impact the Privacy Loss in Differential Privacy? [55.492422758737575]
We study the connection between the per-subject norm in DP neural networks and individual privacy loss.
We introduce a novel metric termed the Privacy Loss-Input Susceptibility (PLIS) which allows one to apportion the subject's privacy loss to their input attributes.
arXiv Detail & Related papers (2022-11-18T11:39:03Z)
- Algorithms with More Granular Differential Privacy Guarantees [65.3684804101664]
We consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis.
In this work, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person.
arXiv Detail & Related papers (2022-09-08T22:43:50Z)
- Post-processing of Differentially Private Data: A Fairness Perspective [53.29035917495491]
This paper shows that post-processing causes disparate impacts on individuals or groups.
It analyzes two critical settings: the release of differentially private datasets and the use of such private datasets for downstream decisions.
It proposes a novel post-processing mechanism that is (approximately) optimal under different fairness metrics.
arXiv Detail & Related papers (2022-01-24T02:45:03Z)
- Decision Making with Differential Privacy under a Fairness Lens [65.16089054531395]
The U.S. Census Bureau releases data sets and statistics about groups of individuals that are used as input to a number of critical decision processes.
To conform to privacy and confidentiality requirements, these agencies are often required to release privacy-preserving versions of the data.
This paper studies the release of differentially private data sets and analyzes their impact on some critical resource allocation tasks under a fairness perspective.
arXiv Detail & Related papers (2021-05-16T21:04:19Z)
- Applications of Differential Privacy in Social Network Analysis: A Survey [60.696428840516724]
Differential privacy is effective in sharing information and preserving privacy with a strong guarantee.
Social network analysis has been extensively adopted in many applications, opening a new arena for the application of differential privacy.
arXiv Detail & Related papers (2020-10-06T19:06:03Z)
- Auditing Differentially Private Machine Learning: How Private is Private SGD? [16.812900569416062]
We investigate whether Differentially Private SGD offers better privacy in practice than what is guaranteed by its state-of-the-art analysis.
We do so via novel data poisoning attacks, which we show correspond to realistic privacy attacks.
arXiv Detail & Related papers (2020-06-13T20:00:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.