A note on differentially private clustering with large additive error
- URL: http://arxiv.org/abs/2009.13317v1
- Date: Mon, 28 Sep 2020 13:40:04 GMT
- Title: A note on differentially private clustering with large additive error
- Authors: Huy L. Nguyen
- Abstract summary: We describe a simple approach to obtain a differentially private algorithm for k-clustering with nearly the same multiplicative factor as non-private algorithms, at the cost of a large additive error.
The approach combines a simple observation, independent of privacy considerations, with any existing private algorithm achieving a constant approximation.
- Score: 15.508460240818575
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this note, we describe a simple approach to obtain a differentially
private algorithm for k-clustering with nearly the same multiplicative factor
as any non-private counterpart at the cost of a large polynomial additive
error. The approach is the combination of a simple geometric observation,
independent of privacy considerations, and any existing private algorithm with
a constant approximation.
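To make the trade-off concrete, here is a minimal sketch of how additive error arises in private clustering: one Lloyd-style k-means update where per-cluster sums and counts are perturbed with Laplace noise. This is an illustrative assumption of a standard mechanism, not the paper's construction; the function name, the budget split, and the `[0, 1]^d` domain assumption are all hypothetical.

```python
import numpy as np

def private_lloyd_step(points, centers, epsilon, rng=None):
    """One k-means update with Laplace noise on cluster sums and counts.

    Illustrative sketch only (not the paper's algorithm). Assumes all
    coordinates lie in [0, 1]^d, so replacing one point changes any
    cluster sum by at most d in L1 norm and any count by at most 1.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = points.shape
    k = centers.shape[0]
    # Assign each point to its nearest current center.
    dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    new_centers = np.empty_like(centers)
    for j in range(k):
        members = points[labels == j]
        # Split the privacy budget between the noisy sum and noisy count;
        # the Laplace scale is (sensitivity / budget share) for each.
        noisy_sum = members.sum(axis=0) + rng.laplace(0, 2 * d / epsilon, size=d)
        noisy_count = len(members) + rng.laplace(0, 2 / epsilon)
        denom = max(noisy_count, 1.0)  # guard against tiny/negative counts
        new_centers[j] = np.clip(noisy_sum / denom, 0.0, 1.0)
    return new_centers
```

The noise added here is what produces the additive error term: small clusters are swamped by the Laplace perturbation, which is exactly the regime where a large polynomial additive error becomes the price of privacy.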
Related papers
- Differentially-Private Hierarchical Clustering with Provable Approximation Guarantees [79.59010418610625]
We study differentially private approximation algorithms for hierarchical clustering.
We show strong lower bounds for the problem: any $\epsilon$-DP algorithm must exhibit $O(|V|^2/\epsilon)$-additive error for an input dataset.
We propose a private $1+o(1)$ approximation algorithm which also recovers the blocks exactly.
arXiv Detail & Related papers (2023-01-31T19:14:30Z) - Robustness Implies Privacy in Statistical Estimation [16.061651295129302]
We study the relationship between adversarial and differential privacy in high-dimensional statistics.
We give the first black-box reduction from privacy to robustness which can produce private estimators with optimal tradeoffs.
Our algorithms are also robust to a nearly optimal fraction of adversarially-corrupted samples.
arXiv Detail & Related papers (2022-12-09T18:07:30Z) - Private Domain Adaptation from a Public Source [48.83724068578305]
We design differentially private discrepancy-based algorithms for adaptation from a source domain with public labeled data to a target domain with unlabeled private data.
Our solutions are based on private variants of Frank-Wolfe and Mirror-Descent algorithms.
arXiv Detail & Related papers (2022-08-12T06:52:55Z) - Personalized Federated Learning via Convex Clustering [72.15857783681658]
We propose a family of algorithms for personalized federated learning with locally convex user costs.
The proposed framework is based on a generalization of convex clustering in which the differences between different users' models are penalized.
arXiv Detail & Related papers (2022-02-01T19:25:31Z) - Differentially-Private Clustering of Easy Instances [67.04951703461657]
In differentially private clustering, the goal is to identify $k$ cluster centers without disclosing information on individual data points.
We provide implementable differentially private clustering algorithms that provide utility when the data is "easy".
We propose a framework that allows us to apply non-private clustering algorithms to the easy instances and privately combine the results.
arXiv Detail & Related papers (2021-12-29T08:13:56Z) - Differentially Private Algorithms for Clustering with Stability Assumptions [0.76146285961466]
We present a far simpler algorithm for clustering stable inputs.
We analyze its utility in both the Wasserstein distance and the k-means cost.
Our algorithm has straightforward analogues for "nice" k-median instances and for the local model of differential privacy.
arXiv Detail & Related papers (2021-06-11T00:45:39Z) - Differentially Private Correlation Clustering [23.93805663525436]
Correlation clustering is a widely used technique in unsupervised machine learning.
Motivated by applications where individual privacy is a concern, we initiate the study of differentially private correlation clustering.
arXiv Detail & Related papers (2021-02-17T17:27:48Z) - Differentially Private Clustering via Maximum Coverage [7.059472280274009]
We study the problem of clustering in metric spaces while preserving the privacy of individual data.
We present differentially private algorithms with constant multiplicative error and lower additive error.
arXiv Detail & Related papers (2020-08-27T22:11:18Z) - Differentially Private Clustering: Tight Approximation Ratios [57.89473217052714]
We give efficient differentially private algorithms for basic clustering problems.
Our results imply an improved algorithm for the Sample and Aggregate privacy framework.
One of the tools used in our 1-Cluster algorithm can be employed to get a faster quantum algorithm for ClosestPair in a moderate number of dimensions.
arXiv Detail & Related papers (2020-08-18T16:22:06Z) - Tighter Generalization Bounds for Iterative Differentially Private Learning Algorithms [95.73230376153872]
This paper studies the relationship between generalization and privacy preservation in iterative learning algorithms via two sequential steps.
We prove that $(\varepsilon, \delta)$-differential privacy implies an on-average generalization bound for multi-database learning algorithms.
We then investigate how the iterative nature shared by most learning algorithms influences privacy preservation and, in turn, generalization.
arXiv Detail & Related papers (2020-07-18T09:12:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.