Privacy and Bias Analysis of Disclosure Avoidance Systems
- URL: http://arxiv.org/abs/2301.12204v1
- Date: Sat, 28 Jan 2023 13:58:25 GMT
- Title: Privacy and Bias Analysis of Disclosure Avoidance Systems
- Authors: Keyu Zhu, Ferdinando Fioretto, Pascal Van Hentenryck, Saswat Das,
Christine Task
- Abstract summary: Disclosure avoidance (DA) systems are used to safeguard the confidentiality of data while allowing it to be analyzed and disseminated for analytic purposes.
This paper presents a framework that addresses this gap: it proposes differentially private versions of these mechanisms and derives their privacy bounds.
The results show that, contrary to popular belief, traditional differential privacy techniques may be superior in terms of accuracy and fairness to differentially private counterparts of widely used DA mechanisms.
- Score: 45.645473465606564
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Disclosure avoidance (DA) systems are used to safeguard the confidentiality
of data while allowing it to be analyzed and disseminated for analytic
purposes. These methods, e.g., cell suppression, swapping, and k-anonymity, are
commonly applied and may have significant societal and economic implications.
However, a formal analysis of their privacy and bias guarantees has been
lacking. This paper presents a framework that addresses this gap: it proposes
differentially private versions of these mechanisms and derives their privacy
bounds. In addition, the paper compares their performance with traditional
differential privacy mechanisms in terms of accuracy and fairness on US Census
data release and classification tasks. The results show that, contrary to
popular belief, traditional differential privacy techniques may be superior in
terms of accuracy and fairness to differentially private counterparts of widely
used DA mechanisms.
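For reference, a minimal sketch of the Laplace mechanism, the canonical traditional DP technique of the kind the paper benchmarks DA counterparts against; the function name and parameter values are our illustration, not code from the paper.
```python
import numpy as np

def laplace_count(true_count: int, epsilon: float, rng=None) -> float:
    """Release a count under epsilon-DP via the Laplace mechanism.

    A counting query has L1 sensitivity 1 (adding or removing one record
    changes it by at most 1), so Laplace noise with scale 1/epsilon
    yields epsilon-differential privacy.
    """
    rng = rng or np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Example: a census-style block count released at epsilon = 1.0.
print(laplace_count(412, epsilon=1.0))
```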
Related papers
- A Statistical Viewpoint on Differential Privacy: Hypothesis Testing, Representation and Blackwell's Theorem [30.365274034429508]
We argue that differential privacy can be considered a pure statistical concept.
$f$-differential privacy is a unified framework for analyzing privacy bounds in data analysis and machine learning.
arXiv Detail & Related papers (2024-09-14T23:47:22Z)
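To make the hypothesis-testing view concrete, the sketch below evaluates the trade-off function of Gaussian differential privacy (GDP), a central object in $f$-DP; this is our illustration, not code from the paper.
```python
from scipy.stats import norm

def gdp_tradeoff(alpha: float, mu: float) -> float:
    """Type II error of the optimal test distinguishing neighboring
    datasets under mu-GDP: T(alpha) = Phi(Phi^{-1}(1 - alpha) - mu).
    Values close to 1 - alpha mean the test barely beats random
    guessing, i.e., strong privacy."""
    return norm.cdf(norm.ppf(1.0 - alpha) - mu)

# At mu = 1, a test with 5% type I error has type II error of about 0.74.
print(gdp_tradeoff(0.05, mu=1.0))
```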
- Privacy-Aware Randomized Quantization via Linear Programming [13.002534825666219]
We propose a family of quantization mechanisms that is unbiased and differentially private.
Our proposed mechanism can attain a better privacy-accuracy trade-off compared to baselines.
arXiv Detail & Related papers (2024-06-01T18:40:08Z)
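The unbiasedness property can be illustrated with plain stochastic rounding, the simplest unbiased quantizer; note this minimal sketch carries no privacy guarantee, whereas the paper's LP-designed mechanisms build in differential privacy as well.
```python
import numpy as np

def stochastic_round(x: np.ndarray, step: float, rng=None) -> np.ndarray:
    """Quantize x to a grid of spacing `step`, rounding up or down at
    random so that E[output] = x (unbiasedness)."""
    rng = rng or np.random.default_rng()
    lo = np.floor(x / step) * step
    p_up = (x - lo) / step  # probability of rounding up
    return lo + step * (rng.random(x.shape) < p_up)

print(stochastic_round(np.array([0.2, 1.7, -0.4]), step=1.0))
```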
- Unified Mechanism-Specific Amplification by Subsampling and Group Privacy Amplification [54.1447806347273]
Amplification by subsampling is one of the main primitives in machine learning with differential privacy.
We propose the first general framework for deriving mechanism-specific guarantees.
We analyze how subsampling affects the privacy of groups of multiple users.
arXiv Detail & Related papers (2024-03-07T19:36:05Z)
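For contrast with the mechanism-specific guarantees that paper derives, here is the classic generic bound for Poisson subsampling of a pure epsilon-DP mechanism (a standard result, not the paper's tighter analysis):
```python
import math

def amplified_epsilon(epsilon: float, q: float) -> float:
    """Privacy amplification by subsampling: running an epsilon-DP
    mechanism on a random q-fraction of the data satisfies
    epsilon'-DP with epsilon' = ln(1 + q * (e^epsilon - 1))."""
    return math.log1p(q * math.expm1(epsilon))

# Subsampling 1% of records turns a 1.0-DP step into roughly 0.017-DP.
print(amplified_epsilon(1.0, q=0.01))
```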
- The Symmetric alpha-Stable Privacy Mechanism [0.0]
We present a novel analysis of the Symmetric alpha-Stable (SaS) mechanism.
We prove that the mechanism is purely differentially private while remaining closed under convolution.
arXiv Detail & Related papers (2023-11-29T16:34:39Z)
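A minimal sketch of perturbing a query answer with symmetric alpha-stable noise via SciPy; the parameter values are illustrative, and the privacy guarantee is the paper's result, not something this code establishes.
```python
from scipy.stats import levy_stable

def sas_release(value: float, alpha: float = 1.5, scale: float = 1.0) -> float:
    """Add symmetric alpha-stable (SaS) noise: beta = 0 gives symmetry,
    and alpha = 2 would recover the Gaussian case. Illustrative only."""
    return value + levy_stable.rvs(alpha, 0.0, loc=0.0, scale=scale)

print(sas_release(42.0))
```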
- Adaptive Privacy Composition for Accuracy-first Mechanisms [55.53725113597539]
Noise reduction mechanisms produce increasingly accurate answers.
Analysts only pay the privacy cost of the least noisy or most accurate answer released.
There has yet to be any study on how ex-post private mechanisms compose.
We develop privacy filters that allow an analyst to adaptively switch between differentially private and ex-post private mechanisms.
arXiv Detail & Related papers (2023-06-24T00:33:34Z)
- Breaking the Communication-Privacy-Accuracy Tradeoff with $f$-Differential Privacy [51.11280118806893]
We consider a federated data analytics problem in which a server coordinates the collaborative data analysis of multiple users with privacy concerns and limited communication capability.
We study the local differential privacy guarantees of discrete-valued mechanisms with finite output space through the lens of $f$-differential privacy (DP).
More specifically, we advance the existing literature by deriving tight $f$-DP guarantees for a variety of discrete-valued mechanisms.
arXiv Detail & Related papers (2023-02-19T16:58:53Z)
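Binary randomized response, sketched below, is the classic discrete-valued mechanism with finite output space that such local DP analyses cover; this is our illustration, not the paper's code.
```python
import numpy as np

def randomized_response(bit: int, epsilon: float, rng=None) -> int:
    """Report the true bit with probability e^eps / (1 + e^eps),
    otherwise flip it. The ratio of output probabilities under the
    two inputs is exactly e^eps, giving epsilon-local DP."""
    rng = rng or np.random.default_rng()
    p_truth = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return bit if rng.random() < p_truth else 1 - bit

print(randomized_response(1, epsilon=1.0))
```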
- Post-processing of Differentially Private Data: A Fairness Perspective [53.29035917495491]
This paper shows that post-processing causes disparate impacts on individuals or groups.
It analyzes two critical settings: the release of differentially private datasets and the use of such private datasets for downstream decisions.
It proposes a novel post-processing mechanism that is (approximately) optimal under different fairness metrics.
arXiv Detail & Related papers (2022-01-24T02:45:03Z)
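A hypothetical mini-experiment in the spirit of that analysis: clamping noisy counts at zero, a common post-processing step, biases small counts upward (all numbers are made up for illustration).
```python
import numpy as np

rng = np.random.default_rng(0)
true_count = 2
noisy = true_count + rng.laplace(scale=2.0, size=100_000)  # epsilon = 0.5
clamped = np.maximum(noisy, 0.0)                           # post-processing
# The raw noisy mean stays near 2.0; clamping pushes it to about 2.37.
print(noisy.mean(), clamped.mean())
```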
- Distribution-Invariant Differential Privacy [4.700764053354502]
We develop a distribution-invariant privatization (DIP) method to reconcile high statistical accuracy and strict differential privacy.
Under the same strictness of privacy protection, DIP achieves superior statistical accuracy in two simulations and on three real-world benchmarks.
arXiv Detail & Related papers (2021-11-08T22:26:50Z)
- Decision Making with Differential Privacy under a Fairness Lens [65.16089054531395]
The U.S. Census Bureau releases data sets and statistics about groups of individuals that are used as input to a number of critical decision processes.
To conform to privacy and confidentiality requirements, such agencies are often required to release privacy-preserving versions of the data.
This paper studies the release of differentially private data sets and analyzes their impact on some critical resource allocation tasks under a fairness perspective.
arXiv Detail & Related papers (2021-05-16T21:04:19Z)
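A hypothetical simulation of the kind of disparate impact studied there: allocations based on thresholding noisy counts flip far more often for small groups, which sit closer to their thresholds in absolute terms (all numbers are invented for illustration).
```python
import numpy as np

rng = np.random.default_rng(1)
epsilon = 0.1  # Laplace noise with scale 1/epsilon = 10
for true_count, threshold in ((105, 100), (1050, 1000)):
    noisy = true_count + rng.laplace(scale=1 / epsilon, size=50_000)
    # Fraction of releases in which the funding decision is wrong.
    flip = np.mean((noisy >= threshold) != (true_count >= threshold))
    print(true_count, f"misallocated with probability {flip:.3f}")
```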