Efficient Mean Estimation with Pure Differential Privacy via a
Sum-of-Squares Exponential Mechanism
- URL: http://arxiv.org/abs/2111.12981v1
- Date: Thu, 25 Nov 2021 09:31:15 GMT
- Title: Efficient Mean Estimation with Pure Differential Privacy via a
Sum-of-Squares Exponential Mechanism
- Authors: Samuel B. Hopkins, Gautam Kamath, Mahbod Majid
- Abstract summary: We give the first polynomial-time algorithm to estimate the mean of a $d$-variate probability distribution with bounded covariance from $\tilde{O}(d)$ independent samples subject to pure differential privacy.
Our main technique is a new approach to use the powerful Sum of Squares method (SoS) to design differentially private algorithms.
- Score: 16.996435043565594
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We give the first polynomial-time algorithm to estimate the mean of a
$d$-variate probability distribution with bounded covariance from
$\tilde{O}(d)$ independent samples subject to pure differential privacy. Prior
algorithms for this problem either incur exponential running time, require
$\Omega(d^{1.5})$ samples, or satisfy only the weaker concentrated or
approximate differential privacy conditions. In particular, all prior
polynomial-time algorithms require $d^{1+\Omega(1)}$ samples to guarantee small
privacy loss with "cryptographically" high probability, $1-2^{-d^{\Omega(1)}}$,
while our algorithm retains $\tilde{O}(d)$ sample complexity even in this
stringent setting.
Our main technique is a new approach to use the powerful Sum of Squares
method (SoS) to design differentially private algorithms. SoS proofs-to-algorithms is a key theme in numerous recent works in high-dimensional
algorithmic statistics -- estimators which apparently require exponential
running time but whose analysis can be captured by low-degree Sum of Squares
proofs can be automatically turned into polynomial-time algorithms with the
same provable guarantees. We demonstrate a similar proofs-to-private-algorithms
phenomenon: instances of the workhorse exponential mechanism which apparently
require exponential time but which can be analyzed with low-degree SoS proofs
can be automatically turned into polynomial-time differentially private
algorithms. We prove a meta-theorem capturing this phenomenon, which we expect
to be of broad use in private algorithm design.
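To make concrete why instances of the exponential mechanism can "apparently require exponential time," here is a minimal, illustrative sketch of a textbook exponential mechanism for 1-D mean estimation over a discretized candidate set. This is not the paper's algorithm: the clipping radius R, the uniform grid, and the distance-based score are assumptions chosen purely for illustration. Extending the same grid to $d$ dimensions needs grid_size^d candidates, which is the kind of exponential blow-up the paper's SoS-based sampler is designed to avoid.

```python
import numpy as np

def exp_mech_mean_1d(x, epsilon, R=1.0, grid_size=1001, rng=None):
    """Textbook exponential mechanism for 1-D mean estimation (illustration only).

    Data are clipped to [-R, R] and candidates form a uniform grid on [-R, R].
    The score s(X, mu) = -(n / (2R)) * |mean(clip(X)) - mu| has sensitivity 1
    (changing one record moves the clipped mean by at most 2R/n), so sampling
    with probability proportional to exp(epsilon * s / 2) is epsilon-DP.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(x, dtype=float), -R, R)
    n = x.size
    candidates = np.linspace(-R, R, grid_size)
    scores = -(n / (2.0 * R)) * np.abs(x.mean() - candidates)
    logits = 0.5 * epsilon * scores
    probs = np.exp(logits - logits.max())  # numerically stable normalization
    probs /= probs.sum()
    return rng.choice(candidates, p=probs)

# A product grid in d dimensions has grid_size**d candidates, so this naive
# sampler is exponential-time in d; the paper's meta-theorem instead yields a
# polynomial-time sampler for exponential-mechanism distributions whose
# utility analysis is captured by low-degree SoS proofs.
```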
Our techniques also draw new connections between differentially private and
robust statistics in high dimensions. In particular, viewed through our
proofs-to-private-algorithms lens, several well-studied SoS proofs from recent
works in algorithmic robust statistics directly yield key components of our
differentially private mean estimation algorithm.
Related papers
- Perturb-and-Project: Differentially Private Similarities and Marginals [73.98880839337873]
We revisit the input perturbations framework for differential privacy where noise is added to the input $A \in \mathcal{S}$.
We first design novel efficient algorithms to privately release pair-wise cosine similarities.
We derive a novel algorithm to compute $k$-way marginal queries over $n$ features.
arXiv Detail & Related papers (2024-06-07T12:07:16Z)
- Differentially-Private Hierarchical Clustering with Provable Approximation Guarantees [79.59010418610625]
We study differentially private approximation algorithms for hierarchical clustering.
We show strong lower bounds for the problem: any $\epsilon$-DP algorithm must exhibit $O(|V|^2/\epsilon)$-additive error for an input dataset.
We propose a private $1+o(1)$ approximation algorithm which also recovers the blocks exactly.
arXiv Detail & Related papers (2023-01-31T19:14:30Z)
- Private estimation algorithms for stochastic block models and mixture models [63.07482515700984]
We give general tools for designing efficient private estimation algorithms.
We give the first efficient $(\epsilon, \delta)$-differentially private algorithm for both weak recovery and exact recovery.
arXiv Detail & Related papers (2023-01-11T09:12:28Z)
- Scalable Differentially Private Clustering via Hierarchically Separated Trees [82.69664595378869]
We show that our method computes a solution with cost at most $O(d^{3/2}\log n)\cdot OPT + O(k d^2 \log^2 n / \epsilon^2)$, where $\epsilon$ is the privacy guarantee.
Although the worst-case guarantee is worse than that of state of the art private clustering methods, the algorithm we propose is practical.
arXiv Detail & Related papers (2022-06-17T09:24:41Z)
- Robust Sparse Mean Estimation via Sum of Squares [42.526664955704746]
We study the problem of high-dimensional sparse mean estimation in the presence of an $\epsilon$-fraction of adversarial outliers.
Our algorithms follow the Sum-of-Squares based, proofs-to-algorithms approach.
arXiv Detail & Related papers (2022-06-07T16:49:54Z)
- Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC [35.050135428062795]
Langevin dynamics-based algorithms offer much faster alternatives under some distance measures such as statistical distance.
Our techniques are simple and generic and apply to underdamped Langevin dynamics.
arXiv Detail & Related papers (2020-10-27T22:52:45Z)
- Private Stochastic Convex Optimization: Optimal Rates in Linear Time [74.47681868973598]
We study the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions.
A recent work of Bassily et al. has established the optimal bound on the excess population loss achievable given $n$ samples.
We describe two new techniques for deriving convex optimization algorithms both achieving the optimal bound on excess loss and using $O(\min\{n, n^2/d\})$ gradient computations.
arXiv Detail & Related papers (2020-05-10T19:52:03Z)
- Locally Private Hypothesis Selection [96.06118559817057]
We output a distribution from $\mathcal{Q}$ whose total variation distance to $p$ is comparable to the best such distribution.
We show that the constraint of local differential privacy incurs an exponential increase in cost.
Our algorithms result in exponential improvements on the round complexity of previous methods.
arXiv Detail & Related papers (2020-02-21T18:30:48Z)