Differentially Private Correlation Clustering
- URL: http://arxiv.org/abs/2102.08885v1
- Date: Wed, 17 Feb 2021 17:27:48 GMT
- Title: Differentially Private Correlation Clustering
- Authors: Mark Bun, Marek Eliáš, Janardhan Kulkarni
- Abstract summary: Correlation clustering is a widely used technique in unsupervised machine learning.
Motivated by applications where individual privacy is a concern, we initiate the study of differentially private correlation clustering.
- Score: 23.93805663525436
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Correlation clustering is a widely used technique in unsupervised machine
learning. Motivated by applications where individual privacy is a concern, we
initiate the study of differentially private correlation clustering. We propose
an algorithm that achieves subquadratic additive error compared to the optimal
cost. In contrast, straightforward adaptations of existing non-private
algorithms all lead to a trivial quadratic error. Finally, we give a lower
bound showing that any pure differentially private algorithm for correlation
clustering requires additive error of $\Omega(n)$.
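To make the objective concrete, here is a short, illustrative Python sketch (the helper names are hypothetical and this is not the paper's algorithm). It computes the correlation clustering cost of a given clustering on a complete signed graph, i.e., the number of disagreements: '+' edges whose endpoints land in different clusters plus '-' edges whose endpoints share a cluster. It also shows how the standard Laplace mechanism could release that single count under edge-level $\epsilon$-differential privacy, assuming neighboring inputs differ in one edge label, so the count has sensitivity 1.

    import numpy as np

    def disagreements(n, plus_edges, labels):
        """Correlation clustering cost on a complete signed graph with n nodes:
        '+' edges whose endpoints are separated plus '-' edges whose endpoints
        share a cluster. labels[v] is the cluster id of node v."""
        plus = {frozenset(e) for e in plus_edges}
        cost = 0
        for u in range(n):
            for v in range(u + 1, n):
                same_cluster = labels[u] == labels[v]
                positive_edge = frozenset((u, v)) in plus
                if positive_edge != same_cluster:  # this pair disagrees with the clustering
                    cost += 1
        return cost

    def private_cost(n, plus_edges, labels, eps, rng=None):
        """Release the cost of a *fixed* clustering under edge-level eps-DP.
        Relabeling one edge changes the count by at most 1 (sensitivity 1),
        so Laplace noise with scale 1/eps suffices."""
        rng = np.random.default_rng() if rng is None else rng
        return disagreements(n, plus_edges, labels) + rng.laplace(0.0, 1.0 / eps)

    # Example: four nodes whose '+' edges pair up {0, 1} and {2, 3}.
    plus_edges = [(0, 1), (2, 3)]
    labels = [0, 0, 1, 1]  # the natural clustering; cost is 0
    print(disagreements(4, plus_edges, labels))          # -> 0
    print(private_cost(4, plus_edges, labels, eps=1.0))  # -> 0 plus Laplace noise

Releasing the cost of a fixed clustering is easy; the hard problem studied in the paper is privately outputting a good clustering itself, where, as noted above, straightforward adaptations of non-private algorithms already incur trivial quadratic additive error.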
Related papers
- Making Old Things New: A Unified Algorithm for Differentially Private Clustering [6.320600791108065]
We show that a 20-year-old algorithm can be slightly modified to work for any of these models of differential privacy.
This provides a unified picture: while matching almost all previously known results, it allows us to improve some of them.
arXiv Detail & Related papers (2024-06-17T15:31:53Z) - Differentially-Private Hierarchical Clustering with Provable Approximation Guarantees [79.59010418610625]
We study differentially private approximation algorithms for hierarchical clustering.
We show strong lower bounds for the problem: that any $\epsilon$-DP algorithm must exhibit $O(|V|^2/\epsilon)$-additive error for an input dataset.
We propose a private $1+o(1)$ approximation algorithm which also recovers the blocks exactly.
arXiv Detail & Related papers (2023-01-31T19:14:30Z) - Differentially Private Stochastic Gradient Descent with Low-Noise [49.981789906200035]
Modern machine learning algorithms aim to extract fine-grained information from data to provide accurate predictions, which often conflicts with the goal of privacy protection.
This paper addresses the practical and theoretical importance of developing privacy-preserving machine learning algorithms that ensure good performance while preserving privacy.
arXiv Detail & Related papers (2022-09-09T08:54:13Z) - Near-Optimal Correlation Clustering with Privacy [37.94795032297396]
Correlation clustering is a central problem in unsupervised learning.
In this paper, we introduce a simple and computationally efficient algorithm for the correlation clustering problem with provable privacy guarantees.
arXiv Detail & Related papers (2022-03-02T22:30:19Z) - Differentially-Private Clustering of Easy Instances [67.04951703461657]
In differentially private clustering, the goal is to identify $k$ cluster centers without disclosing information on individual data points.
We provide implementable differentially private clustering algorithms that provide utility when the data is "easy."
We propose a framework that allows us to apply non-private clustering algorithms to the easy instances and privately combine the results.
arXiv Detail & Related papers (2021-12-29T08:13:56Z) - Learning with User-Level Privacy [61.62978104304273]
We analyze algorithms to solve a range of learning tasks under user-level differential privacy constraints.
Rather than guaranteeing only the privacy of individual samples, user-level DP protects a user's entire contribution.
We derive an algorithm that privately answers a sequence of $K$ adaptively chosen queries with privacy cost proportional to $\tau$, and apply it to solve the learning tasks we consider.
arXiv Detail & Related papers (2021-02-23T18:25:13Z) - A note on differentially private clustering with large additive error [15.508460240818575]
We describe a simple approach to obtain a differentially private algorithm for $k$-clustering with nearly the same additive error.
The approach combines a simple observation, independent of privacy considerations, with any existing private algorithm that achieves a constant-factor approximation.
arXiv Detail & Related papers (2020-09-28T13:40:04Z) - Differentially Private Clustering via Maximum Coverage [7.059472280274009]
We study the problem of clustering in metric spaces while preserving the privacy of individual data.
We present differentially private algorithms with constant multiplicative error and lower additive error.
arXiv Detail & Related papers (2020-08-27T22:11:18Z) - Differentially Private Clustering: Tight Approximation Ratios [57.89473217052714]
We give efficient differentially private algorithms for basic clustering problems.
Our results imply an improved algorithm for the Sample and Aggregate privacy framework.
One of the tools used in our 1-Cluster algorithm can be employed to get a faster quantum algorithm for ClosestPair in a moderate number of dimensions.
arXiv Detail & Related papers (2020-08-18T16:22:06Z) - Tighter Generalization Bounds for Iterative Differentially Private Learning Algorithms [95.73230376153872]
This paper studies the relationship between generalization and privacy preservation in iterative learning algorithms via two sequential steps.
We prove that $(\varepsilon, \delta)$-differential privacy implies an on-average generalization bound for multi-database learning algorithms.
We then investigate how the iterative nature shared by most learning algorithms influences privacy preservation and, further, generalization.
arXiv Detail & Related papers (2020-07-18T09:12:03Z)