Approximate Differential Privacy of the $\ell_2$ Mechanism
- URL: http://arxiv.org/abs/2502.15929v1
- Date: Fri, 21 Feb 2025 20:56:34 GMT
- Title: Approximate Differential Privacy of the $\ell_2$ Mechanism
- Authors: Matthew Joseph, Alex Kulesza, Alexander Yu
- Abstract summary: We study the $\ell_2$ mechanism for computing a $d$-dimensional statistic with bounded $\ell_2$ sensitivity under approximate differential privacy. Across a range of privacy parameters, we find that the $\ell_2$ mechanism obtains lower error than the Laplace and Gaussian mechanisms.
- Score: 52.61055173572399
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the $\ell_2$ mechanism for computing a $d$-dimensional statistic with bounded $\ell_2$ sensitivity under approximate differential privacy. Across a range of privacy parameters, we find that the $\ell_2$ mechanism obtains lower error than the Laplace and Gaussian mechanisms, matching the former at $d=1$ and approaching the latter as $d \to \infty$.
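For orientation, here is a minimal sketch of the three mechanisms being compared, assuming a $d$-dimensional statistic with $\ell_2$ sensitivity `delta_2`. The Laplace and Gaussian calibrations below are the textbook ones (the Laplace scale uses the $\ell_1 \le \sqrt{d}\,\ell_2$ bound; the Gaussian bound assumes $\varepsilon \le 1$), and the $\ell_2$ sampler is the standard K-norm construction with a Gamma-distributed radius; this does not reproduce the paper's sharper approximate-DP analysis, and all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(stat, delta_2, eps):
    # eps-DP via the l1 <= sqrt(d) * l2 bound on sensitivity.
    d = stat.shape[0]
    scale = np.sqrt(d) * delta_2 / eps
    return stat + rng.laplace(scale=scale, size=d)

def gaussian_mechanism(stat, delta_2, eps, delta):
    # Classical (eps, delta)-DP calibration (valid for eps <= 1).
    d = stat.shape[0]
    sigma = delta_2 * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return stat + rng.normal(scale=sigma, size=d)

def l2_mechanism(stat, delta_2, eps):
    # K-norm mechanism with the l2 ball: noise density proportional to
    # exp(-eps * ||z||_2 / delta_2). Sample a uniform direction and an
    # independent Gamma(d, delta_2 / eps) radius.
    d = stat.shape[0]
    direction = rng.normal(size=d)
    direction /= np.linalg.norm(direction)
    radius = rng.gamma(shape=d, scale=delta_2 / eps)
    return stat + radius * direction

stat = np.zeros(8)  # toy 8-dimensional statistic with l2 sensitivity 1
print(l2_mechanism(stat, delta_2=1.0, eps=1.0))
```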
Related papers
- Optimality of Matrix Mechanism on $\ell_p^p$-metric [6.076406622352115]
We characterize such an error under $(\epsilon,\delta)$-differential privacy.
We give tight bounds on answering prefix sum and parity queries under differential privacy for all constant $p$ in terms of the $\ell_p^p$ error.
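For background, the matrix mechanism answers a workload of linear queries $Wx$ by choosing a factorization $W = LR$, releasing $Rx$ with noise $z$ calibrated to the sensitivity of $R$, and reconstructing the answers as $L(Rx + z)$; this is the standard description of the mechanism, not the paper's optimality characterization.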
arXiv Detail & Related papers (2024-06-04T09:27:35Z) - Smooth Sensitivity for Geo-Privacy [17.835910182900985]
Local model of differential privacy (LDP) is the predominant model under which the problem has been studied.
Geo-Privacy (GP) stipulates that the level of distinguishability be proportional to $\mathrm{dist}(x_i, x_i')$.
We generalize the smooth sensitivity framework from Differential Privacy to Geo-Privacy, which allows us to add noise tailored to the hardness of the given instance.
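For context, Geo-Privacy (also known as geo-indistinguishability) is commonly stated as the distance-scaled analogue of the DP constraint, $\Pr[M(x_i) \in S] \le e^{\varepsilon \cdot \mathrm{dist}(x_i, x_i')} \Pr[M(x_i') \in S]$ for all outputs $S$ and inputs $x_i, x_i'$; this is the standard formulation rather than a restatement of the paper's smooth-sensitivity construction.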
arXiv Detail & Related papers (2024-05-10T08:32:07Z) - General Gaussian Noise Mechanisms and Their Optimality for Unbiased Mean Estimation [58.03500081540042]
A classical approach to private mean estimation is to compute the true mean and add unbiased, but possibly correlated, Gaussian noise to it.
We show that for every input dataset, an unbiased mean estimator satisfying concentrated differential privacy introduces at least approximately as much error as the best such Gaussian noise mechanism.
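As a concrete baseline for the construction described above, here is a minimal sketch with i.i.d. (rather than correlated) Gaussian noise, assuming records clipped to the unit $\ell_2$ ball and a $\rho$-zCDP target; the function and parameter names are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def private_mean_zcdp(X, rho):
    # Unbiased mean of the rows of X (each assumed to have ||x||_2 <= 1)
    # under rho-zCDP, using i.i.d. Gaussian noise. The cited paper studies
    # the more general case of possibly correlated Gaussian noise.
    n, d = X.shape
    sensitivity = 2.0 / n                     # replace-one l2 sensitivity of the mean
    sigma = sensitivity / np.sqrt(2.0 * rho)  # Gaussian mechanism gives (Delta^2 / 2 sigma^2)-zCDP
    return X.mean(axis=0) + rng.normal(scale=sigma, size=d)

X = rng.normal(size=(1000, 5))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))  # clip rows to the unit l2 ball
print(private_mean_zcdp(X, rho=0.5))
```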
arXiv Detail & Related papers (2023-01-31T18:47:42Z) - Contraction of Locally Differentially Private Mechanisms [8.547801169402923]
We derive tight bounds on the divergence between the output distributions $PK$ and $QK$ of an $\epsilon$-LDP mechanism $K$.
We then utilize these bounds to establish locally private versions of the van Trees inequality, Le Cam's, Assouad's, and the mutual information methods.
arXiv Detail & Related papers (2022-10-24T16:38:12Z) - Faster Privacy Accounting via Evolving Discretization [54.32252900997422]
We introduce a new algorithm for numerical composition of privacy random variables.
Our algorithm achieves a running time and memory usage of $\mathrm{polylog}(k)$ for the task of self-composing a mechanism $k$ times.
arXiv Detail & Related papers (2022-07-10T04:25:37Z) - A unified interpretation of the Gaussian mechanism for differential privacy through the sensitivity index [61.675604648670095]
We argue that the three prevailing interpretations of the GM, namely $(\varepsilon, \delta)$-DP, $f$-DP and Rényi DP can be expressed by using a single parameter $\psi$, which we term the sensitivity index.
$\psi$ uniquely characterises the GM and its properties by encapsulating its two fundamental quantities: the sensitivity of the query and the magnitude of the noise perturbation.
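For reference, the same two quantities drive the classical calibration of the GM: for $\varepsilon \le 1$, adding Gaussian noise with standard deviation $\sigma \ge \Delta\sqrt{2\ln(1.25/\delta)}/\varepsilon$ to a query with $\ell_2$ sensitivity $\Delta$ yields $(\varepsilon, \delta)$-DP. This is the textbook bound, not the paper's $\psi$-based characterization.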
arXiv Detail & Related papers (2021-09-22T06:20:01Z) - Locally differentially private estimation of nonlinear functionals of discrete distributions [9.028773906859541]
We study the problem of estimating non-linear functionals of discrete distributions in the context of local differential privacy.
Only $\alpha$-locally differentially private (LDP) samples are publicly available, where the term 'local' means that each $z_i$ is produced using one individual $x_i$.
We describe the behavior of the quadratic risk for estimating the power sum functional $F_\gamma = \sum_{k=1}^{K} p_k^{\gamma}$, $\gamma > 0$, as a function of $K$ and $n$.
arXiv Detail & Related papers (2021-07-08T16:11:10Z) - Private Stochastic Convex Optimization: Optimal Rates in $\ell_1$ Geometry [69.24618367447101]
Up to logarithmic factors, the optimal excess population loss of any $(\varepsilon,\delta)$-differentially private optimizer is $\sqrt{\log(d)/n} + \sqrt{d}/(\varepsilon n)$.
We show that when the loss functions satisfy additional smoothness assumptions, the excess loss is upper bounded (up to logarithmic factors) by $\sqrt{\log(d)/n} + (\log(d)/(\varepsilon n))^{2/3}$.
arXiv Detail & Related papers (2021-03-02T06:53:44Z) - A bounded-noise mechanism for differential privacy [3.9596068699962323]
We output an approximation of the average $\frac{1}{n}\sum_{i=1}^{n} \vec{x}^{(i)}$ of vectors $\vec{x}^{(i)} \in [0,1]^k$, while preserving privacy with respect to any $\vec{x}^{(i)}$.
We present an $(\epsilon,\delta)$-private mechanism with optimal $\ell_\infty$ error for most values of $\delta$.
arXiv Detail & Related papers (2020-12-07T16:03:21Z) - Locally Private Hypothesis Selection [96.06118559817057]
We output a distribution from $\mathcal{Q}$ whose total variation distance to $p$ is comparable to the best such distribution.
We show that the constraint of local differential privacy incurs an exponential increase in cost.
Our algorithms result in exponential improvements on the round complexity of previous methods.
arXiv Detail & Related papers (2020-02-21T18:30:48Z)