Connecting Robust Shuffle Privacy and Pan-Privacy
- URL: http://arxiv.org/abs/2004.09481v4
- Date: Wed, 12 Aug 2020 03:45:00 GMT
- Title: Connecting Robust Shuffle Privacy and Pan-Privacy
- Authors: Victor Balcer, Albert Cheu, Matthew Joseph, and Jieming Mao
- Abstract summary: In the \emph{shuffle model} of differential privacy, data-holding users send randomized messages to a secure shuffler, and the shuffler permutes the messages.
In the \emph{pan-private} model, an algorithm processes a stream of data while maintaining an internal state that is differentially private with regard to the stream data.
Our results focus on \emph{robustly} shuffle private protocols, whose privacy guarantees are not greatly affected by malicious users.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the \emph{shuffle model} of differential privacy, data-holding users send
randomized messages to a secure shuffler, the shuffler permutes the messages,
and the resulting collection of messages must be differentially private with
regard to user data. In the \emph{pan-private} model, an algorithm processes a
stream of data while maintaining an internal state that is differentially
private with regard to the stream data. We give evidence connecting these two
apparently different models.
Our results focus on \emph{robustly} shuffle private protocols, whose privacy
guarantees are not greatly affected by malicious users. First, we give robustly
shuffle private protocols and upper bounds for counting distinct elements and
uniformity testing. Second, we use pan-private lower bounds to prove robustly
shuffle private lower bounds for both problems. Focusing on the dependence on
the domain size $k$, we find that robust approximate shuffle privacy and
approximate pan-privacy have additive error $\Theta(\sqrt{k})$ for counting
distinct elements. For uniformity testing, we give a robust approximate shuffle
private protocol with sample complexity $\tilde O(k^{2/3})$ and show that an
$\Omega(k^{2/3})$ dependence is necessary for any robust pure shuffle private
tester. Finally, we show that this connection is useful in both directions: we
give a pan-private adaptation of recent work on shuffle private histograms and
use it to recover further separations between pan-privacy and interactive local
privacy.
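The counting-distinct-elements result above can be illustrated with a small sketch in the spirit of the classic pan-private distinct-elements estimator (Dwork et al.): the internal state is one randomized bit per domain element, so the state itself stays differentially private with respect to the stream. This is an illustrative simplification, not the exact protocol or parameters from the paper.

```python
import math
import random

def pan_private_distinct_count(stream, domain_size, epsilon):
    """Illustrative pan-private distinct-elements estimator.

    The state is a single noisy bit per domain element; updates use
    randomized response, so the state is differentially private with
    regard to the stream at all times.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1)  # bias of a "seen" bit
    # Initialize every bit uniformly at random: the state reveals nothing yet.
    bits = [random.random() < 0.5 for _ in range(domain_size)]
    for x in stream:
        # On seeing x, re-draw its bit biased toward 1 (randomized response).
        bits[x] = random.random() < p
    ones = sum(bits)
    # Unseen elements contribute ~1/2 each in expectation; seen ones ~p each,
    # so E[ones] = 0.5*k + d*(p - 0.5) where d is the true distinct count.
    return (ones - domain_size * 0.5) / (p - 0.5)
```

The additive error comes from the ~√k fluctuation of the k random bits, matching the Θ(√k) dependence on the domain size discussed in the abstract.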
Related papers
- Enhanced Privacy Bound for Shuffle Model with Personalized Privacy
The shuffle model of Differential Privacy (DP) is an enhanced privacy protocol which introduces an intermediate trusted server between local users and a central data curator.
It significantly amplifies the central DP guarantee by anonymizing and shuffling the local randomized data.
This work focuses on deriving the central privacy bound for a more practical setting where personalized local privacy is required by each user.
arXiv Detail & Related papers (2024-07-25T16:11:56Z)
- A Generalized Shuffle Framework for Privacy Amplification: Strengthening Privacy Guarantees and Enhancing Utility
We show how to shuffle in the $(\epsilon_i,\delta_i)$-PLDP setting with personalized privacy parameters.
We prove that the shuffled $(\epsilon_i,\delta_i)$-PLDP process approximately preserves $\mu$-Gaussian Differential Privacy with $\mu = \sqrt{\frac{2}{\sum_{i=1}^n \frac{1-\delta_i}{1+e^{\epsilon_i}} - \max_i \frac{1-\delta_i}{1+e^{\epsilon_i}}}}$.
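The closed-form Gaussian-DP parameter in that abstract is easy to evaluate numerically; a minimal sketch, with the formula transcribed from the abstract and variable names of our own choosing:

```python
import math

def shuffle_gdp_mu(eps_list, delta_list):
    """Evaluate the mu-GDP parameter for shuffling personalized
    (eps_i, delta_i)-PLDP reports, per the formula quoted in the abstract:
    mu = sqrt( 2 / (sum_i (1-delta_i)/(1+e^eps_i) - max_i (1-delta_i)/(1+e^eps_i)) ).
    """
    terms = [(1 - d) / (1 + math.exp(e)) for e, d in zip(eps_list, delta_list)]
    return math.sqrt(2.0 / (sum(terms) - max(terms)))
```

For example, with 100 users all at $(\epsilon_i, \delta_i) = (1, 0)$ the resulting $\mu$ is about 0.27, far stronger than any single user's local guarantee, which is the amplification effect the paper quantifies.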
arXiv Detail & Related papers (2023-12-22T02:31:46Z)
- Differentially Private Aggregation via Imperfect Shuffling
We introduce the imperfect shuffle differential privacy model, where messages are shuffled in an almost uniform manner before being observed by a curator for private aggregation.
Surprisingly, we show that no additional error overhead is necessary in the imperfect shuffle model.
arXiv Detail & Related papers (2023-08-28T17:34:52Z)
- Privacy Amplification via Shuffling: Unified, Simplified, and Tightened
We propose a comprehensive framework for privacy amplification in both single-message and multi-message shuffle protocols.
Our theoretical results demonstrate that our framework provides tighter bounds, especially for local randomizers with extremal probability design.
Our bounds also result in a remarkably efficient $\tilde O(n)$ algorithm that numerically amplifies privacy in less than $10$ seconds for $n=10^8$ users.
arXiv Detail & Related papers (2023-04-11T06:27:25Z)
- Concurrent Shuffle Differential Privacy Under Continual Observation
We study the private continual summation problem (a.k.a. the counter problem).
We show that the concurrent shuffle model allows for significantly improved error compared to a standard (single) shuffle model.
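The continual summation problem mentioned here is classically solved by the binary-tree mechanism (Dwork et al.; Chan et al.), which the concurrent-shuffle work builds on. A minimal central-DP sketch of that baseline (not the shuffle-model protocol from the paper):

```python
import math
import random

def continual_counter(bits, epsilon):
    """Binary-tree mechanism for private continual counting: release a noisy
    running sum after each bit, with polylog(T) error, by adding Laplace noise
    to dyadic-interval partial sums (each bit appears in <= log T intervals).
    """
    T = len(bits)
    levels = max(1, math.ceil(math.log2(T + 1)))
    scale = levels / epsilon          # sensitivity of one bit across intervals
    noise = {}                        # one Laplace draw per dyadic interval
    out = []
    for t in range(1, T + 1):
        # Decompose [1, t] into O(log T) dyadic intervals (Fenwick-style).
        total, rem = 0.0, t
        while rem > 0:
            span = rem & -rem             # largest dyadic block ending at rem
            key = (rem - span + 1, rem)   # interval [rem-span+1, rem]
            if key not in noise:
                # Laplace(scale) as a difference of two exponentials.
                noise[key] = (random.expovariate(1 / scale)
                              - random.expovariate(1 / scale))
            total += sum(bits[key[0] - 1:key[1]]) + noise[key]
            rem -= span
        out.append(total)
    return out
```

The concurrent-shuffle result improves on the error achievable when this kind of counter must run in the (single) shuffle model rather than with a trusted curator.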
arXiv Detail & Related papers (2023-01-29T20:42:54Z)
- Smooth Anonymity for Sparse Graphs
Differential privacy has emerged as the gold standard of privacy; however, it faces challenges when it comes to sharing sparse datasets.
In this work, we consider a variation of $k$-anonymity, which we call smooth-$k$-anonymity, and design simple large-scale algorithms that efficiently provide smooth-$k$-anonymity.
arXiv Detail & Related papers (2022-07-13T17:09:25Z)
- Individual Privacy Accounting for Differentially Private Stochastic Gradient Descent
We characterize privacy guarantees for individual examples when releasing models trained by DP-SGD.
We find that most examples enjoy stronger privacy guarantees than the worst-case bound.
This implies groups that are underserved in terms of model utility simultaneously experience weaker privacy guarantees.
arXiv Detail & Related papers (2022-06-06T13:49:37Z)
- Privacy Amplification via Shuffling for Linear Contextual Bandits
We study the contextual linear bandit problem with differential privacy (DP).
We show that it is possible to achieve a privacy/utility trade-off between joint DP (JDP) and local DP (LDP) by leveraging the shuffle model of privacy while preserving local privacy.
arXiv Detail & Related papers (2021-12-11T15:23:28Z)
- Hiding Among the Clones: A Simple and Nearly Optimal Analysis of Privacy Amplification by Shuffling
We show that random shuffling amplifies differential privacy guarantees of locally randomized data.
Our result is based on a new approach that is simpler than previous work and extends to approximate differential privacy with nearly the same guarantees.
arXiv Detail & Related papers (2020-12-23T17:07:26Z)
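The amplification-by-shuffling pipeline analyzed in the entry above can be sketched end to end: each user applies a local randomizer, a trusted shuffler permutes the messages, and an analyzer debiases the shuffled sum. This is a toy illustration of the model, not the analysis technique of the paper itself.

```python
import math
import random

def randomized_response(bit, eps0):
    """eps0-LDP randomized response on a single bit."""
    p = math.exp(eps0) / (math.exp(eps0) + 1)
    return bit if random.random() < p else 1 - bit

def shuffle_protocol(bits, eps0):
    """Toy single-message shuffle protocol for estimating a sum of bits.

    Amplification results such as "Hiding Among the Clones" show the
    shuffled output satisfies central DP with a much smaller epsilon
    than eps0 once the number of users is large.
    """
    messages = [randomized_response(b, eps0) for b in bits]
    random.shuffle(messages)          # the trusted shuffler
    p = math.exp(eps0) / (math.exp(eps0) + 1)
    n = len(messages)
    # Debias: E[sum of messages] = n*(1-p) + (true sum)*(2p - 1).
    return (sum(messages) - n * (1 - p)) / (2 * p - 1)
```

The shuffler's only job is the permutation; all the privacy beyond the local eps0 guarantee comes from the anonymity it provides.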
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.