Fast Optimal Locally Private Mean Estimation via Random Projections
- URL: http://arxiv.org/abs/2306.04444v2
- Date: Mon, 26 Jun 2023 22:19:51 GMT
- Title: Fast Optimal Locally Private Mean Estimation via Random Projections
- Authors: Hilal Asi, Vitaly Feldman, Jelani Nelson, Huy L. Nguyen, Kunal Talwar
- Abstract summary: We study the problem of locally private mean estimation of high-dimensional vectors in the Euclidean ball.
We propose a new algorithmic framework, ProjUnit, for private mean estimation.
Our framework is deceptively simple: each randomizer projects its input to a random low-dimensional subspace, normalizes the result, and then runs an optimal algorithm.
- Score: 58.603579803010796
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the problem of locally private mean estimation of high-dimensional
vectors in the Euclidean ball. Existing algorithms for this problem either
incur sub-optimal error or have high communication and/or run-time complexity.
We propose a new algorithmic framework, ProjUnit, for private mean estimation
that yields algorithms that are computationally efficient, have low
communication complexity, and incur optimal error up to a $1+o(1)$-factor. Our
framework is deceptively simple: each randomizer projects its input to a random
low-dimensional subspace, normalizes the result, and then runs an optimal
algorithm such as PrivUnitG in the lower-dimensional space. In addition, we
show that, by appropriately correlating the random projection matrices across
devices, we can achieve fast server run-time. We mathematically analyze the
error of the algorithm in terms of properties of the random projections, and
study two instantiations. Lastly, our experiments for private mean estimation
and private federated learning demonstrate that our algorithms empirically
obtain nearly the same utility as optimal ones while having significantly lower
communication and computational cost.
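To make the ProjUnit recipe concrete, below is a minimal Python sketch of the per-device randomizer described in the abstract: project the input onto a random low-dimensional subspace, normalize, and hand the result to a low-dimensional mechanism. The function and parameter names (projunit_randomizer, k, low_dim_randomizer) are illustrative, and the Gaussian-noise stand-in is not the PrivUnitG mechanism analyzed in the paper; it is only a placeholder that keeps the sketch runnable.

```python
# Minimal sketch of a ProjUnit-style local randomizer, assuming a Gaussian
# random projection. The low-dimensional mechanism below is a placeholder,
# NOT the actual PrivUnitG randomizer from the paper.
import numpy as np

def projunit_randomizer(x, k, low_dim_randomizer, rng=None):
    """Project x in R^d onto a random k-dim subspace, normalize, then privatize.

    x                  : input vector with ||x||_2 <= 1
    k                  : projection dimension (k << d)
    low_dim_randomizer : callable implementing a k-dimensional mechanism
                         (PrivUnitG in the paper; a toy stand-in here)
    rng                : np.random.Generator; sharing/correlating the seed
                         across devices is what the paper uses to obtain
                         fast server-side aggregation
    """
    rng = rng or np.random.default_rng()
    d = x.shape[0]
    # Random projection with i.i.d. N(0, 1/k) entries, so E[||W x||^2] = ||x||^2.
    W = rng.normal(size=(k, d)) / np.sqrt(k)
    y = W @ x
    y = y / max(np.linalg.norm(y), 1e-12)   # normalize onto the unit sphere in R^k
    z = low_dim_randomizer(y)               # privatize in the low-dimensional space
    # Map the response back to R^d (here simply via the transpose); the server
    # averages these lifted responses to estimate the mean.
    return W.T @ z

# Hypothetical stand-in for PrivUnitG: adds Gaussian noise and carries no
# formal privacy guarantee; shown only so the sketch runs end to end.
def toy_gaussian_randomizer(y, sigma=1.0, rng=np.random.default_rng(0)):
    return y + rng.normal(scale=sigma, size=y.shape)

if __name__ == "__main__":
    x = np.random.default_rng(1).normal(size=512)
    x /= np.linalg.norm(x)
    out = projunit_randomizer(x, k=64, low_dim_randomizer=toy_gaussian_randomizer)
    print(out.shape)  # (512,)
```

In this sketch each device pays only a k-dimensional privatization cost, which is the source of the communication and run-time savings the abstract claims; the choice of how the projection matrices are correlated across devices is left abstract here.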
Related papers
- Efficient distributed representations with linear-time attention scores normalization [3.8673630752805437]
We propose a linear-time approximation of the attention score normalization constants for embedding vectors with bounded norms.
The accuracy of our estimation formula surpasses that of competing kernel methods by orders of magnitude.
The proposed algorithm is highly interpretable and easily adapted to an arbitrary embedding problem.
arXiv Detail & Related papers (2023-03-30T15:48:26Z)
- Subset-Based Instance Optimality in Private Estimation [23.173651165908282]
We show how to construct private algorithms that achieve our notion of instance optimality when estimating a broad class of dataset properties.
Our algorithm simultaneously matches or exceeds the performance of existing algorithms under a range of distributional assumptions.
arXiv Detail & Related papers (2023-03-01T18:49:10Z)
- Private estimation algorithms for stochastic block models and mixture models [63.07482515700984]
We develop general tools for designing efficient private estimation algorithms.
We give the first efficient $(\epsilon, \delta)$-differentially private algorithm for both weak recovery and exact recovery.
arXiv Detail & Related papers (2023-01-11T09:12:28Z)
- Scalable Differentially Private Clustering via Hierarchically Separated Trees [82.69664595378869]
We show that our method computes a solution with cost at most $O(d^{3/2}\log n)\cdot \mathrm{OPT} + O(k d^2 \log^2 n / \epsilon^2)$, where $\epsilon$ is the privacy guarantee.
Although the worst-case guarantee is worse than that of state-of-the-art private clustering methods, the algorithm we propose is practical.
arXiv Detail & Related papers (2022-06-17T09:24:41Z)
- Optimal Algorithms for Mean Estimation under Local Differential Privacy [55.32262879188817]
We show that PrivUnit achieves the optimal variance among a large family of locally private randomizers.
We also develop a new variant of PrivUnit based on the Gaussian distribution which is more amenable to mathematical analysis and enjoys the same optimality guarantees.
arXiv Detail & Related papers (2022-05-05T06:43:46Z)
- Differentially Private Clustering: Tight Approximation Ratios [57.89473217052714]
We give efficient differentially private algorithms for basic clustering problems.
Our results imply an improved algorithm for the Sample and Aggregate privacy framework.
One of the tools used in our 1-Cluster algorithm can be employed to get a faster quantum algorithm for ClosestPair in a moderate number of dimensions.
arXiv Detail & Related papers (2020-08-18T16:22:06Z)
- Private Stochastic Convex Optimization: Optimal Rates in Linear Time [74.47681868973598]
We study the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions.
A recent work of Bassily et al. has established the optimal bound on the excess population loss achievable given $n$ samples.
We describe two new techniques for deriving convex optimization algorithms, both achieving the optimal bound on excess loss and using $O(\min\{n, n^2/d\})$ gradient computations.
arXiv Detail & Related papers (2020-05-10T19:52:03Z)