The Correlated Gaussian Sparse Histogram Mechanism
- URL: http://arxiv.org/abs/2412.10357v1
- Date: Fri, 13 Dec 2024 18:51:33 GMT
- Title: The Correlated Gaussian Sparse Histogram Mechanism
- Authors: Christian Janos Lebeda, Lukas Retschmeier
- Abstract summary: We consider the problem of releasing a sparse histogram under $(\varepsilon, \delta)$-differential privacy.
We adopt Lebeda's technique and show that adding correlated noise to the non-zero counts allows us to reduce the magnitude of the noise only when we have a sparsity bound.
- Score: 1.2277343096128712
- Abstract: We consider the problem of releasing a sparse histogram under $(\varepsilon, \delta)$-differential privacy. The stability histogram independently adds noise from a Laplace or Gaussian distribution to the non-zero entries and removes those noisy counts below a threshold. Thereby, the introduction of new non-zero values between neighboring histograms is revealed only with probability at most $\delta$, and typically, the value of the threshold dominates the error of the mechanism. We consider the variant of the stability histogram with Gaussian noise. Recent works ([Joseph and Yu, COLT '24] and [Lebeda, SOSA '25]) reduced the error for private histograms using correlated Gaussian noise. However, these techniques cannot be directly applied in the very sparse setting. Instead, we adopt Lebeda's technique and show that adding correlated noise to the non-zero counts allows us to reduce the magnitude of the noise only when we have a sparsity bound. This, in turn, allows us to lower the threshold by up to a factor of $1/2$ compared to the uncorrelated noise mechanism. We then extend our mechanism to a setting without a known bound on sparsity. Additionally, we show that correlated noise can give a similar improvement for the more practical discrete Gaussian mechanism.
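The following is a minimal sketch of the Gaussian stability histogram described in the abstract. The parameters `sigma` and `threshold` are illustrative stand-ins; the exact calibration to $(\varepsilon, \delta)$ is given in the paper, not here.

```python
import numpy as np

def gaussian_stability_histogram(counts, sigma, threshold, rng=None):
    """Sketch of the stability histogram with Gaussian noise.

    Independently adds N(0, sigma^2) noise to each non-zero count and
    suppresses noisy counts below `threshold`. Calibrating `sigma` and
    `threshold` to a target (epsilon, delta) follows the paper and is
    not reproduced here.
    """
    rng = rng or np.random.default_rng()
    released = {}
    for key, count in counts.items():  # only non-zero entries are stored
        noisy = count + rng.normal(0.0, sigma)
        if noisy >= threshold:  # small noisy counts are suppressed
            released[key] = noisy
    return released

# Illustrative usage on a sparse histogram.
print(gaussian_stability_histogram({"a": 41, "b": 3, "c": 17}, sigma=2.0, threshold=8.0))
```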
Related papers
- Sign Operator for Coping with Heavy-Tailed Noise: High Probability Convergence Bounds with Extensions to Distributed Optimization and Comparison Oracle [77.3806516979843]
We show that SignSGD achieves the optimal sample complexity $\tilde{O}\left(\varepsilon^{-\frac{3\kappa - 2}{\kappa - 1}}\right)$ with high accuracy.
We also explore the application of the sign operator in zeroth-order optimization with an oracle that can only compare function values at two different points.
arXiv Detail & Related papers (2025-02-11T19:54:11Z) - Count on Your Elders: Laplace vs Gaussian Noise [9.546521474972485]
We argue that Laplace noise may in fact be preferable to Gaussian noise in many settings.
We show that the noise added by the Gaussian mechanism can always be replaced by Laplace noise of comparable variance.
This challenges the conventional wisdom that Gaussian noise should be preferred in high-dimensional settings.
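To make "comparable variance" concrete: a Laplace distribution with scale $b$ has variance $2b^2$, so matching a Gaussian with standard deviation $\sigma$ means taking $b = \sigma/\sqrt{2}$. A minimal sketch of this substitution (the privacy analysis justifying it is in the paper):

```python
import numpy as np

def laplace_matching_gaussian(sigma, size, rng=None):
    """Sample Laplace noise whose variance matches N(0, sigma^2).

    Laplace with scale b has variance 2 * b^2, so b = sigma / sqrt(2)
    matches a Gaussian with standard deviation sigma.
    """
    rng = rng or np.random.default_rng()
    return rng.laplace(0.0, sigma / np.sqrt(2.0), size=size)

noise = laplace_matching_gaussian(sigma=2.0, size=100_000)
print(noise.var())  # close to sigma^2 = 4.0
```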
arXiv Detail & Related papers (2024-08-13T16:36:33Z) - Better Gaussian Mechanism using Correlated Noise [1.450405446885067]
We show that adding a random variable distributed as a Gaussian with variance scaled by $(\sqrt{d} + 1)/4$ to all counts allows us to reduce the variance of the independent noise samples to scale only with $(d + \sqrt{d})/4$.
The central idea of our mechanism is simple and the technique is flexible.
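A sketch of this correlated-noise construction, mirroring the scalings quoted above; `base_var` is an illustrative stand-in for the paper's calibration constant, and the privacy proof is in the paper:

```python
import numpy as np

def correlated_gaussian_mechanism(counts, base_var, rng=None):
    """Sketch of the correlated Gaussian mechanism described above.

    One shared Gaussian draw, with variance scaled by (sqrt(d) + 1) / 4,
    is added to all d counts; the independent per-coordinate noise then
    only needs variance scaling with (d + sqrt(d)) / 4 rather than d.
    """
    rng = rng or np.random.default_rng()
    d = len(counts)
    shared_std = np.sqrt(base_var * (np.sqrt(d) + 1.0) / 4.0)
    indep_std = np.sqrt(base_var * (d + np.sqrt(d)) / 4.0)
    shared = rng.normal(0.0, shared_std)        # single draw, added everywhere
    indep = rng.normal(0.0, indep_std, size=d)  # fresh draw per coordinate
    return np.asarray(counts, dtype=float) + shared + indep

print(correlated_gaussian_mechanism([41, 3, 17], base_var=1.0))
```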
arXiv Detail & Related papers (2024-08-13T12:31:03Z) - Revisiting Convergence of AdaGrad with Relaxed Assumptions [4.189643331553922]
We revisit the convergence of AdaGrad with momentum (covering AdaGrad as a special case) on problems.
This model encompasses a broad range of noises, including the sub-Gaussian noise arising in many practical applications.
arXiv Detail & Related papers (2024-02-21T13:24:14Z) - Some Constructions of Private, Efficient, and Optimal $K$-Norm and Elliptic Gaussian Noise [54.34628844260993]
Differentially private computation often begins with a bound on some $d$-dimensional statistic's sensitivity.
For pure differential privacy, the $K$-norm mechanism can improve on this approach using a norm tailored to the statistic's sensitivity space.
This paper solves both problems for the simple statistics of sum, count, and vote.
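One instance where the $K$-norm mechanism is easy to make concrete: if the sensitivity space is the $\ell_1$ ball, the density proportional to $\exp(-\varepsilon \lVert z \rVert_1)$ factorizes across coordinates, so sampling reduces to independent Laplace draws. A sketch of that special case (other norms require the dedicated samplers constructed in the paper):

```python
import numpy as np

def l1_norm_mechanism(statistic, epsilon, rng=None):
    """K-norm mechanism specialized to the l1 ball.

    exp(-epsilon * ||z||_1) = prod_i exp(-epsilon * |z_i|), so the
    noise is simply independent Laplace(1 / epsilon) per coordinate.
    """
    rng = rng or np.random.default_rng()
    statistic = np.asarray(statistic, dtype=float)
    return statistic + rng.laplace(0.0, 1.0 / epsilon, size=statistic.shape)
```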
arXiv Detail & Related papers (2023-09-27T17:09:36Z) - Anonymized Histograms in Intermediate Privacy Models [54.32252900997422]
We provide an algorithm with a nearly matching error guarantee of $\tilde{O}_{\varepsilon}(\sqrt{n})$ in the shuffle DP and pan-private models.
Our algorithm is very simple: it just post-processes the discrete Laplace-noised histogram.
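A sketch of that pipeline; the discrete Laplace sampler below is standard (a difference of two i.i.d. geometric variables), while the clamp-and-sort post-processing is an illustrative guess at the final step, not the paper's exact procedure:

```python
import numpy as np

def discrete_laplace(t, size, rng=None):
    """Discrete Laplace noise, P(k) proportional to exp(-|k| / t),
    sampled as the difference of two i.i.d. geometric variables."""
    rng = rng or np.random.default_rng()
    p = 1.0 - np.exp(-1.0 / t)
    return rng.geometric(p, size) - rng.geometric(p, size)

def anonymized_histogram_sketch(counts, t, rng=None):
    """Noise the counts, then post-process: clamp at zero and sort.
    An anonymized histogram keeps only the multiset of counts, so
    releasing them in sorted order discards the labels."""
    noisy = np.asarray(counts) + discrete_laplace(t, len(counts), rng)
    return np.sort(np.maximum(noisy, 0))[::-1]
```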
arXiv Detail & Related papers (2022-10-27T05:11:00Z) - Differential privacy for symmetric log-concave mechanisms [0.0]
Adding random noise to database query results is an important tool for achieving privacy.
We provide a necessary and sufficient condition for $(\epsilon, \delta)$-differential privacy for all symmetric and log-concave noise densities.
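For reference, the guarantee this condition characterizes is the standard one: a mechanism $M$ is $(\epsilon, \delta)$-differentially private if, for all neighboring databases $D, D'$ and all measurable sets $S$ of outputs,
$$\Pr[M(D) \in S] \le e^{\epsilon} \, \Pr[M(D') \in S] + \delta.$$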
arXiv Detail & Related papers (2022-02-23T10:20:29Z) - Random quantum circuits transform local noise into global white noise [118.18170052022323]
We study the distribution over measurement outcomes of noisy random quantum circuits in the low-fidelity regime.
For local noise that is sufficiently weak and unital, correlations (measured by the linear cross-entropy benchmark) between the output distribution $p_\text{noisy}$ of a generic noisy circuit instance and the corresponding noiseless output distribution shrink exponentially.
If the noise is incoherent, the output distribution approaches the uniform distribution $p_\text{unif}$ at precisely the same rate.
arXiv Detail & Related papers (2021-11-29T19:26:28Z) - Fast Dimension Independent Private AdaGrad on Publicly Estimated Subspaces [24.52697154861819]
We show that noisy AdaGrad achieves a regret comparable to traditional AdaGrad plus a well-controlled term due to noise.
We operate with general convex functions in both constrained and unconstrained minimization.
arXiv Detail & Related papers (2020-08-14T20:46:38Z) - Shape Matters: Understanding the Implicit Bias of the Noise Covariance [76.54300276636982]
Noise in gradient descent provides a crucial implicit regularization effect for training overparameterized models.
We show that parameter-dependent noise -- induced by mini-batches or label perturbation -- is far more effective than Gaussian noise.
Our analysis reveals that parameter-dependent noise introduces a bias towards local minima with smaller noise variance, whereas spherical Gaussian noise does not.
arXiv Detail & Related papers (2020-06-15T18:31:02Z) - Contextual Linear Bandits under Noisy Features: Towards Bayesian Oracles [65.9694455739978]
We study contextual linear bandit problems under feature uncertainty, where the features are noisy and have missing entries.
Our analysis reveals that the optimal hypothesis can significantly deviate from the underlying realizability function, depending on the noise characteristics.
This implies that classical approaches cannot guarantee a non-trivial regret bound.
arXiv Detail & Related papers (2017-03-03T21:39:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences arising from its use.