Designing Differentially Private Estimators in High Dimensions
- URL: http://arxiv.org/abs/2006.01944v3
- Date: Sat, 18 Jul 2020 17:01:13 GMT
- Title: Designing Differentially Private Estimators in High Dimensions
- Authors: Aditya Dhar, Jason Huang
- Abstract summary: We study differentially private mean estimation in a high-dimensional setting.
Recent work in high-dimensional robust statistics has identified computationally tractable mean estimation algorithms.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study differentially private mean estimation in a high-dimensional
setting. Existing differential privacy techniques applied to large dimensions
lead to computationally intractable problems or estimators with excessive
privacy loss. Recent work in high-dimensional robust statistics has identified
computationally tractable mean estimation algorithms with asymptotic
dimension-independent error guarantees. We incorporate these results to develop
a strict bound on the global sensitivity of the robust mean estimator. This
yields a computationally tractable algorithm for differentially private mean
estimation in high dimensions with dimension-independent privacy loss. Finally,
we show on synthetic data that our algorithm significantly outperforms classic
differential privacy methods, overcoming barriers to high-dimensional
differential privacy.
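As a rough illustration of the recipe described above (robust estimate first, then noise calibrated to a global-sensitivity bound), the following sketch pairs a stand-in robust estimator with the Gaussian mechanism. The coordinate-wise trimmed mean and the `sensitivity` argument are placeholders, not the paper's estimator or its sensitivity analysis.

```python
import numpy as np

def private_robust_mean(X, epsilon, delta, sensitivity):
    """Hedged sketch: robust mean estimate + Gaussian mechanism.

    `sensitivity` stands in for the paper's global-sensitivity bound on
    the robust estimator; that bound is what makes the noise (and hence
    the privacy loss) dimension-independent, and it is not derived here.
    """
    # Stand-in robust estimator: a coordinate-wise trimmed mean.
    # The paper uses a computationally tractable high-dimensional
    # robust mean estimator instead.
    lo, hi = np.percentile(X, [10, 90], axis=0)
    mu_hat = np.clip(X, lo, hi).mean(axis=0)

    # Gaussian mechanism: standard (epsilon, delta)-DP noise scale.
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return mu_hat + np.random.normal(0.0, sigma, size=mu_hat.shape)

# Hypothetical usage on synthetic data:
X = np.random.standard_normal((1000, 50))
estimate = private_robust_mean(X, epsilon=1.0, delta=1e-5, sensitivity=0.1)
```

Because the noise scale depends only on the sensitivity bound and the privacy parameters, a dimension-independent sensitivity bound yields dimension-independent privacy loss, which is the point of the abstract.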
Related papers
- Privacy-Preserving Set-Based Estimation Using Differential Privacy and Zonotopes [2.206168301581203]
For large-scale cyber-physical systems, the collaboration of spatially distributed sensors is often needed to perform the state estimation process.
Privacy concerns arise from disclosing sensitive measurements to a cloud estimator.
We propose a differentially private set-based estimation protocol that guarantees true state containment in the estimated set and differential privacy for the sensitive measurements.
arXiv Detail & Related papers (2024-08-30T13:05:38Z)
- Differentially Private Sliced Inverse Regression: Minimax Optimality and Algorithm [16.14032140601778]
We propose optimally differentially private algorithms designed to address privacy concerns in the context of sufficient dimension reduction.
We develop differentially private algorithms that achieve the minimax lower bounds up to logarithmic factors.
As a natural extension, we can readily offer analogous lower and upper bounds for differentially private sparse principal component analysis.
arXiv Detail & Related papers (2024-01-16T06:47:43Z)
- Optimal Private Discrete Distribution Estimation with One-bit Communication [63.413106413939836]
We consider a private discrete distribution estimation problem with one-bit communication constraint.
We characterize the first-order behavior of the worst-case trade-off under the one-bit communication constraint.
These results demonstrate the optimal dependence of the privacy-utility trade-off under the one-bit communication constraint. (A toy randomized-response baseline is sketched after this entry.)
arXiv Detail & Related papers (2023-10-17T05:21:19Z)
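For intuition only: a classic way to satisfy a one-bit communication constraint under local differential privacy is randomized response on a single derived bit. The sketch below is this illustrative baseline, using a hypothetical public threshold; it is not the optimal scheme the paper characterizes.

```python
import numpy as np

def one_bit_report(x, threshold, epsilon):
    """Send one epsilon-LDP bit about a sample via randomized response.

    The bit encodes whether x exceeds a hypothetical public threshold;
    with probability e^eps / (e^eps + 1) the truthful bit is kept.
    """
    bit = int(x > threshold)
    p_keep = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return bit if np.random.random() < p_keep else 1 - bit

def debias(reports, epsilon):
    """Unbiased estimate of P(x > threshold) from the noisy bits."""
    p = np.exp(epsilon) / (np.exp(epsilon) + 1.0)
    return (np.mean(reports) - (1.0 - p)) / (2.0 * p - 1.0)
```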
- Robustness Implies Privacy in Statistical Estimation [16.061651295129302]
We study the relationship between adversarial robustness and differential privacy in high-dimensional statistics.
We give the first black-box reduction from privacy to robustness that can produce private estimators with optimal tradeoffs.
Our algorithms are also robust to a nearly optimal fraction of adversarially corrupted samples.
arXiv Detail & Related papers (2022-12-09T18:07:30Z)
- Differentially private multivariate medians [4.588028371034407]
We develop novel finite-sample performance guarantees for differentially private depth-based medians.
We show that under Cauchy marginals, the cost of heavy-tailed location estimation outweighs the cost of privacy. (A toy private-median sketch follows this entry.)
arXiv Detail & Related papers (2022-10-12T17:56:04Z)
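As a toy illustration of private location estimation (not the depth-based medians the paper analyzes), the sketch below runs the exponential mechanism for a one-dimensional median over an assumed public range and applies it coordinate-wise, yielding a marginal rather than a genuinely multivariate median.

```python
import numpy as np

def private_median_1d(x, epsilon, lo, hi, grid_size=512):
    """Exponential-mechanism median on a bounded 1-D sample.

    Candidate t scores minus the distance of its rank from n/2; this
    utility has sensitivity 1, so sampling proportionally to
    exp(epsilon * utility / 2) is epsilon-DP. [lo, hi] is an assumed
    public data range.
    """
    candidates = np.linspace(lo, hi, grid_size)
    ranks = np.searchsorted(np.sort(x), candidates)
    utility = -np.abs(ranks - len(x) / 2.0)
    # Subtract the max utility before exponentiating for stability.
    weights = np.exp(epsilon * (utility - utility.max()) / 2.0)
    return np.random.choice(candidates, p=weights / weights.sum())

def private_marginal_median(X, epsilon, lo, hi):
    """Coordinate-wise medians, splitting the budget across coordinates."""
    d = X.shape[1]
    return np.array([private_median_1d(X[:, j], epsilon / d, lo, hi)
                     for j in range(d)])
```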
- On the Statistical Complexity of Estimation and Testing under Privacy Constraints [17.04261371990489]
We show how to characterize the power of a statistical test under differential privacy in a plug-and-play fashion.
We show that maintaining privacy results in a noticeable reduction in performance only when the level of privacy protection is very high.
Finally, we demonstrate that the DP-SGLD algorithm, a private convex solver, can be employed for maximum likelihood estimation with a high degree of confidence.
arXiv Detail & Related papers (2022-10-05T12:55:53Z)
- Differentially Private Stochastic Gradient Descent with Low-Noise [49.981789906200035]
Modern machine learning algorithms aim to extract fine-grained information from data to provide accurate predictions, which often conflicts with the goal of privacy protection.
This paper addresses the practical and theoretical challenge of developing machine learning algorithms that perform well while preserving privacy. (A generic DP-SGD step is sketched after this entry.)
arXiv Detail & Related papers (2022-09-09T08:54:13Z)
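For reference, here is a minimal sketch of the standard DP-SGD template (per-example gradient clipping followed by Gaussian noise on the averaged gradient). None of the parameter choices come from the paper, and calibrating `noise_multiplier` to a target (epsilon, delta) requires a separate privacy accountant.

```python
import numpy as np

def dp_sgd_step(w, per_example_grads, lr, clip_norm, noise_multiplier):
    """One DP-SGD update: clip each per-example gradient to `clip_norm`,
    average, then add Gaussian noise scaled to the clipping bound."""
    clipped = [g * min(1.0, clip_norm / max(np.linalg.norm(g), 1e-12))
               for g in per_example_grads]
    mean_grad = np.mean(clipped, axis=0)
    sigma = noise_multiplier * clip_norm / len(per_example_grads)
    return w - lr * (mean_grad + np.random.normal(0.0, sigma, size=w.shape))
```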
- Algorithms with More Granular Differential Privacy Guarantees [65.3684804101664]
We consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis.
In this work, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person. (A minimal per-attribute noise sketch follows this entry.)
arXiv Detail & Related papers (2022-09-08T22:43:50Z)
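One minimal way to see how a per-attribute guarantee can be tighter than a whole-record guarantee: add attribute-specific Laplace noise, so each attribute j is protected at its own epsilon_j while the full record is only protected at the (larger) sum by basic composition. This sketch illustrates the accounting and is not an algorithm from the paper.

```python
import numpy as np

def per_attribute_laplace(attribute_sums, sensitivities, epsilons):
    """Release per-attribute sums with attribute-specific Laplace noise.

    Attribute j alone is protected at epsilon_j; by basic composition the
    entire record is only protected at sum(epsilons), so each per-attribute
    parameter can be much smaller than the whole-record parameter.
    """
    scales = np.asarray(sensitivities) / np.asarray(epsilons)
    return np.asarray(attribute_sums) + np.random.laplace(0.0, scales)
```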
- Private Domain Adaptation from a Public Source [48.83724068578305]
We design differentially private discrepancy-based algorithms for adaptation from a source domain with public labeled data to a target domain with unlabeled private data.
Our solutions are based on private variants of the Frank-Wolfe and Mirror-Descent algorithms. (A generic noisy Frank-Wolfe loop is sketched after this entry.)
arXiv Detail & Related papers (2022-08-12T06:52:55Z)
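As a hedged sketch of the noisy Frank-Wolfe template (perturbed gradients feeding a linear minimization oracle over an l1 ball): the paper's discrepancy-based objective and its exact noise calibration are not reproduced here, and `noise_scale` is a free parameter that a composition analysis would set.

```python
import numpy as np

def noisy_frank_wolfe(grad_fn, dim, radius, steps, noise_scale):
    """Generic noisy Frank-Wolfe over the l1 ball of a given radius.

    grad_fn(w) returns the objective's gradient; Laplace noise is added
    before the linear minimization oracle, which for an l1 ball just
    picks a signed vertex along the largest noisy coordinate.
    """
    w = np.zeros(dim)
    for t in range(steps):
        g = grad_fn(w) + np.random.laplace(0.0, noise_scale, size=dim)
        j = np.argmax(np.abs(g))           # LMO over the l1 ball
        v = np.zeros(dim)
        v[j] = -radius * np.sign(g[j])
        gamma = 2.0 / (t + 2.0)            # standard FW step size
        w = (1.0 - gamma) * w + gamma * v
    return w
```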
- Decentralized Stochastic Optimization with Inherent Privacy Protection [103.62463469366557]
Decentralized optimization is the basic building block of modern collaborative machine learning, distributed estimation and control, and large-scale sensing.
Since the data involved are often sensitive, privacy protection has become an increasingly pressing need in the implementation of decentralized optimization algorithms.
arXiv Detail & Related papers (2022-05-08T14:38:23Z)
- Robustness Threats of Differential Privacy [70.818129585404]
We experimentally demonstrate that networks trained with differential privacy can, in some settings, be even more vulnerable than their non-private counterparts.
We study how the main ingredients of differentially private neural network training, such as gradient clipping and noise addition, affect the robustness of the model.
arXiv Detail & Related papers (2020-12-14T18:59:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.