Towards practical differentially private causal graph discovery
- URL: http://arxiv.org/abs/2006.08598v1
- Date: Mon, 15 Jun 2020 18:30:41 GMT
- Title: Towards practical differentially private causal graph discovery
- Authors: Lun Wang and Qi Pang and Dawn Song
- Abstract summary: Causal graph discovery refers to the process of discovering causal relation graphs from purely observational data.
We present a differentially private causal graph discovery algorithm, Priv-PC, which improves both utility and running time compared to the state-of-the-art.
- Score: 74.7791110594082
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Causal graph discovery refers to the process of discovering causal relation
graphs from purely observational data. Like other statistical data, a causal
graph might leak sensitive information about participants in the dataset. In
this paper, we present a differentially private causal graph discovery
algorithm, Priv-PC, which improves both utility and running time compared to
the state-of-the-art. The design of Priv-PC follows a novel paradigm called
sieve-and-examine which uses a small amount of privacy budget to filter out
"insignificant" queries, and leverages the remaining budget to obtain highly
accurate answers for the "significant" queries. We also conducted the first
sensitivity analysis for conditional independence tests including conditional
Kendall's tau and conditional Spearman's rho. We evaluated Priv-PC on 4 public
datasets and compared with the state-of-the-art. The results show that Priv-PC
achieves 10.61 to 32.85 times speedup and better utility.
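The sieve-and-examine paradigm is easy to prototype. Below is a minimal Python sketch (hypothetical function names and budget split; the real Priv-PC applies this to conditional independence tests with a careful sparse-vector-style privacy accounting):

```python
import numpy as np

rng = np.random.default_rng(0)

def sieve_and_examine(queries, data, threshold, eps_sieve, eps_examine, sensitivity=1.0):
    # Sieve: a sparse-vector-style test that spends a small budget eps_sieve
    # to filter out "insignificant" queries (those below a noisy threshold).
    # NOTE: this is a loose sketch; a faithful accountant charges eps_sieve
    # per query that passes the sieve.
    noisy_threshold = threshold + rng.laplace(scale=2.0 * sensitivity / eps_sieve)
    results = {}
    for name, query in queries.items():
        answer = query(data)
        if answer + rng.laplace(scale=4.0 * sensitivity / eps_sieve) <= noisy_threshold:
            continue  # sieved out as insignificant: nothing accurate is released
        # Examine: spend the remaining budget on a highly accurate Laplace answer.
        results[name] = answer + rng.laplace(scale=sensitivity / eps_examine)
    return results

# Hypothetical usage: two counting queries over a toy dataset.
data = np.array([0, 1, 1, 1, 0, 1])
queries = {"ones": lambda d: d.sum(), "zeros": lambda d: (d == 0).sum()}
print(sieve_and_examine(queries, data, threshold=2.5, eps_sieve=0.1, eps_examine=1.0))
```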
Related papers
- Locally Private Histograms in All Privacy Regimes [14.453502159917525]
We investigate locally private histograms, and the closely related distribution learning task, in a medium-to-low privacy regime.
We obtain a protocol for histograms in the local model of differential privacy, with accuracy matching previous algorithms but significantly better message and communication complexity.
Our theoretical findings emerge from a novel analysis, which appears to improve bounds across the board for the locally private histogram problem.
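As a loose illustration of the basic local-model primitive behind such histogram protocols, here is a k-ary randomized response sketch in Python (generic, not this paper's protocol):

```python
import numpy as np

rng = np.random.default_rng(1)

def krr_report(value, k, eps):
    # Report the true bin with probability e^eps / (e^eps + k - 1),
    # otherwise a uniformly random *other* bin.
    if rng.random() < np.exp(eps) / (np.exp(eps) + k - 1):
        return value
    other = int(rng.integers(k - 1))
    return other if other < value else other + 1

def krr_estimate(reports, k, eps):
    # Debias the aggregated reports into an unbiased histogram estimate.
    n = len(reports)
    p = np.exp(eps) / (np.exp(eps) + k - 1)
    q = 1.0 / (np.exp(eps) + k - 1)
    counts = np.bincount(reports, minlength=k)
    return (counts - n * q) / (p - q)

# Hypothetical usage: 2000 users, each holding one of k=4 values.
true_values = rng.integers(4, size=2000)
reports = np.array([krr_report(int(v), k=4, eps=1.0) for v in true_values])
print(np.round(krr_estimate(reports, k=4, eps=1.0)))
```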
arXiv Detail & Related papers (2024-08-09T06:22:45Z)
- Privacy Profiles for Private Selection [21.162924003105484]
We work out an easy-to-use recipe that bounds privacy profiles of ReportNoisyMax and PrivateTuning using the privacy profiles of the base algorithms they corral.
Our approach improves over all regimes of interest and leads to substantial benefits in end-to-end private learning experiments.
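For reference, ReportNoisyMax itself (one of the base algorithms whose privacy profile the paper bounds) can be sketched in a few lines; the Gumbel-noise version below coincides with the exponential mechanism (a generic sketch, not the paper's analysis):

```python
import numpy as np

rng = np.random.default_rng(2)

def report_noisy_max(scores, eps, sensitivity=1.0):
    # Gumbel-noise argmax samples candidate i with probability proportional
    # to exp(eps * scores[i] / (2 * sensitivity)); releasing only the argmax
    # index is eps-differentially private.
    noise = rng.gumbel(scale=2.0 * sensitivity / eps, size=len(scores))
    return int(np.argmax(np.asarray(scores, dtype=float) + noise))

# Hypothetical usage: privately pick the best of four candidate models.
print(report_noisy_max([0.71, 0.74, 0.69, 0.73], eps=1.0))
```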
arXiv Detail & Related papers (2024-02-09T08:31:46Z)
- Independent Distribution Regularization for Private Graph Embedding [55.24441467292359]
Graph embeddings are susceptible to attribute inference attacks, which allow attackers to infer private node attributes from the learned graph embeddings.
To address these concerns, privacy-preserving graph embedding methods have emerged.
We propose a novel approach called Private Variational Graph AutoEncoders (PVGAE) with the aid of independent distribution penalty as a regularization term.
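One simple proxy for such an independence regularizer is a decorrelation penalty on the latent codes; the sketch below (hypothetical, not necessarily PVGAE's exact term) penalizes off-diagonal covariance:

```python
import numpy as np

def independence_penalty(z):
    # Decorrelation proxy: mean squared off-diagonal covariance across
    # latent dimensions. Zero covariance is necessary (though not
    # sufficient) for independent latent distributions.
    zc = z - z.mean(axis=0, keepdims=True)
    cov = zc.T @ zc / max(len(z) - 1, 1)
    off_diag = cov - np.diag(np.diag(cov))
    return float((off_diag ** 2).mean())

# Hypothetical usage: a term added to a graph autoencoder's training loss.
z = np.random.default_rng(3).normal(size=(100, 16))  # toy node embeddings
print(independence_penalty(z))
```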
arXiv Detail & Related papers (2023-08-16T13:32:43Z)
- Causal Inference with Differentially Private (Clustered) Outcomes [16.166525280886578]
Estimating causal effects from randomized experiments is only feasible if participants agree to reveal their responses.
We suggest a new differential privacy mechanism, Cluster-DP, which leverages any given cluster structure.
We show that, depending on an intuitive measure of cluster quality, we can improve the variance loss while maintaining our privacy guarantees.
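A toy cluster-level baseline conveys the idea: add Laplace noise to per-cluster arm means and average the contrasts (a naive sketch, not the paper's Cluster-DP mechanism):

```python
import numpy as np

rng = np.random.default_rng(4)

def private_cluster_ate(outcomes, treated, clusters, eps, outcome_range=1.0):
    # Each unit contributes to exactly one cluster/arm mean, so by parallel
    # composition the whole release is eps-DP for bounded outcomes.
    diffs = []
    for c in np.unique(clusters):
        m = clusters == c
        t, u = m & treated, m & ~treated
        if t.sum() == 0 or u.sum() == 0:
            continue  # no treated/control contrast in this cluster
        noisy_t = outcomes[t].mean() + rng.laplace(scale=outcome_range / (t.sum() * eps))
        noisy_u = outcomes[u].mean() + rng.laplace(scale=outcome_range / (u.sum() * eps))
        diffs.append(noisy_t - noisy_u)
    return float(np.mean(diffs))

# Hypothetical usage with synthetic outcomes bounded in [0, 1].
n = 400
clusters = rng.integers(8, size=n)
treated = rng.random(n) < 0.5
outcomes = np.clip(0.3 + 0.2 * treated + rng.normal(0, 0.1, n), 0, 1)
print(private_cluster_ate(outcomes, treated, clusters, eps=1.0))
```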
arXiv Detail & Related papers (2023-08-02T05:51:57Z)
- Adaptive Privacy Composition for Accuracy-first Mechanisms [55.53725113597539]
Noise reduction mechanisms produce increasingly accurate answers.
Analysts only pay the privacy cost of the least noisy or most accurate answer released.
There has yet to be any study on how ex-post private mechanisms compose.
We develop privacy filters that allow an analyst to adaptively switch between differentially private and ex-post private mechanisms.
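The core object here is a privacy filter; a minimal basic-composition version looks like the sketch below (the paper's filters additionally handle ex-post costs that are only known after a release):

```python
class PrivacyFilter:
    """Minimal basic-composition filter: halt an adaptive sequence of
    mechanisms before the cumulative budget is exceeded."""

    def __init__(self, eps_budget):
        self.eps_budget = eps_budget
        self.spent = 0.0

    def charge(self, eps):
        # Refuse the release if it would push total spend past the budget.
        if self.spent + eps > self.eps_budget:
            return False
        self.spent += eps
        return True

# Hypothetical usage: adaptively chosen per-release costs.
f = PrivacyFilter(eps_budget=1.0)
for cost in [0.2, 0.35, 0.1, 0.5]:
    print(cost, "accepted" if f.charge(cost) else "rejected: filter halts")
```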
arXiv Detail & Related papers (2023-06-24T00:33:34Z)
- Differentially Private Topological Data Analysis [6.983833229285599]
This paper is the first to attempt differentially private (DP) topological data analysis (TDA).
We show that the commonly used Čech complex has sensitivity that does not decrease as the sample size $n$ increases.
We propose using the exponential mechanism whose utility function is defined in terms of the bottleneck distance of the $L^1$-DTM persistence diagrams.
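The exponential mechanism itself is generic and easy to sketch; in the snippet below the utility function is a stand-in (the paper's utility is the negative bottleneck distance to the true persistence diagram):

```python
import numpy as np

rng = np.random.default_rng(5)

def exponential_mechanism(candidates, utility, eps, sensitivity):
    # Sample candidate i with probability proportional to
    # exp(eps * utility(candidates[i]) / (2 * sensitivity)).
    u = np.array([utility(c) for c in candidates], dtype=float)
    logits = eps * u / (2.0 * sensitivity)
    logits -= logits.max()  # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]

# Hypothetical usage: utility = negative distance to a target value.
candidates = list(np.linspace(0, 1, 11))
print(exponential_mechanism(candidates, lambda c: -abs(c - 0.4), eps=2.0, sensitivity=0.1))
```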
arXiv Detail & Related papers (2023-05-05T15:15:04Z)
- Analyzing Privacy Leakage in Machine Learning via Multiple Hypothesis Testing: A Lesson From Fano [83.5933307263932]
We study data reconstruction attacks for discrete data and analyze them under the framework of hypothesis testing.
We show that if the underlying private data takes values from a set of size $M$, then the target privacy parameter $\epsilon$ can be $O(\log M)$ before the adversary gains significant inferential power.
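A back-of-the-envelope sketch of the $O(\log M)$ scaling: pure $\epsilon$-DP bounds the mutual information leaked about a single secret by roughly $\epsilon$ nats, and Fano's inequality then lower-bounds the adversary's reconstruction error (illustrative only, not the paper's exact theorem):

```python
import numpy as np

def fano_error_lower_bound(eps, M):
    # If an eps-DP release of a secret uniform over M values satisfies
    # I(X; Z) <= eps nats, Fano's inequality gives
    # P(error) >= 1 - (eps + ln 2) / ln M, clipped to [0, 1].
    return float(np.clip(1.0 - (eps + np.log(2)) / np.log(M), 0.0, 1.0))

# The bound stays non-trivial until eps grows like log M:
for M in [10, 1000, 10**6]:
    print(M, fano_error_lower_bound(eps=2.0, M=M))
```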
arXiv Detail & Related papers (2022-10-24T23:50:12Z)
- Smooth Anonymity for Sparse Graphs [69.1048938123063]
Differential privacy has emerged as the gold standard of privacy; however, it can be difficult to apply when sharing sparse datasets.
In this work, we consider a variation of $k$-anonymity, which we call smooth-$k$-anonymity, and design simple large-scale algorithms that efficiently provide smooth-$k$-anonymity.
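For context, the naive baseline that a smooth variant of $k$-anonymity refines can be sketched as item-level suppression (a crude illustration, not the paper's algorithm):

```python
from collections import Counter

def suppress_rare_items(records, k):
    # Naive item-level suppression: drop any item supported by fewer than
    # k records. This is only a crude starting point, not smooth-k-anonymity.
    support = Counter(item for r in records for item in set(r))
    return [[item for item in r if support[item] >= k] for r in records]

# Hypothetical usage on sparse set-valued records.
records = [["a", "b"], ["a", "c"], ["a", "b", "d"], ["b"]]
print(suppress_rare_items(records, k=2))
```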
arXiv Detail & Related papers (2022-07-13T17:09:25Z)
- Individual Privacy Accounting for Differentially Private Stochastic Gradient Descent [69.14164921515949]
We characterize privacy guarantees for individual examples when releasing models trained by DP-SGD.
We find that most examples enjoy stronger privacy guarantees than the worst-case bound.
This implies groups that are underserved in terms of model utility simultaneously experience weaker privacy guarantees.
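The intuition behind individual accounting can be sketched directly: after per-example clipping, an example with a small gradient effectively faces a larger noise multiplier (hypothetical helper, not the paper's accountant):

```python
import numpy as np

def per_example_noise_multiplier(grad_norms, clip=1.0, sigma=1.0):
    # DP-SGD adds Gaussian noise with std sigma * clip to the clipped
    # gradient sum. An example whose clipped contribution has norm c_i
    # effectively faces noise multiplier sigma * clip / c_i, so examples
    # with small gradients accrue less privacy loss per step.
    c = np.minimum(np.asarray(grad_norms, dtype=float), clip)
    return sigma * clip / np.maximum(c, 1e-12)

# Hypothetical per-example gradient norms within one batch.
print(per_example_noise_multiplier([0.05, 0.4, 1.7, 3.0], clip=1.0, sigma=0.8))
```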
arXiv Detail & Related papers (2022-06-06T13:49:37Z)
- Auditing Differentially Private Machine Learning: How Private is Private SGD? [16.812900569416062]
We investigate whether Differentially Private SGD offers better privacy in practice than what is guaranteed by its state-of-the-art analysis.
We do so via novel data poisoning attacks, which we show correspond to realistic privacy attacks.
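Such audits typically convert an attack's operating point into an empirical lower bound on $\epsilon$; a standard sketch (ignoring confidence intervals and $\delta$) is:

```python
import numpy as np

def empirical_eps_lower_bound(tpr, fpr):
    # For a pure eps-DP mechanism, any membership test obeys
    # TPR <= e^eps * FPR and (1 - FPR) <= e^eps * (1 - TPR); an observed
    # attack therefore certifies eps >= max of the two log-ratios.
    eps1 = np.log(tpr / fpr) if fpr > 0 else np.inf
    eps2 = np.log((1 - fpr) / (1 - tpr)) if tpr < 1 else np.inf
    return float(max(eps1, eps2, 0.0))

# Hypothetical attack operating point from a poisoning-based audit.
print(empirical_eps_lower_bound(tpr=0.9, fpr=0.1))  # ~2.197
```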
arXiv Detail & Related papers (2020-06-13T20:00:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.