Fully Adaptive Composition in Differential Privacy
- URL: http://arxiv.org/abs/2203.05481v3
- Date: Tue, 24 Oct 2023 14:37:58 GMT
- Title: Fully Adaptive Composition in Differential Privacy
- Authors: Justin Whitehouse and Aaditya Ramdas and Ryan Rogers and Zhiwei Steven Wu
- Abstract summary: Well-known advanced composition theorems allow one to query a private database quadratically more times than basic privacy composition would permit.
We introduce fully adaptive composition, wherein both algorithms and their privacy parameters can be selected adaptively.
We construct filters that match the rates of advanced composition, including constants, despite allowing for adaptively chosen privacy parameters.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Composition is a key feature of differential privacy. Well-known advanced
composition theorems allow one to query a private database quadratically more
times than basic privacy composition would permit. However, these results
require that the privacy parameters of all algorithms be fixed before
interacting with the data. To address this, Rogers et al. introduced fully
adaptive composition, wherein both algorithms and their privacy parameters can
be selected adaptively. They defined two probabilistic objects to measure
privacy in adaptive composition: privacy filters, which provide differential
privacy guarantees for composed interactions, and privacy odometers, which
provide time-uniform bounds on privacy loss. There are substantial gaps between
advanced composition and existing filters and odometers. First, existing
filters place stronger assumptions on the algorithms being composed. Second,
these odometers and filters suffer from large constants, making them
impractical. We construct filters that match the rates of advanced composition,
including constants, despite allowing for adaptively chosen privacy parameters.
En route, we also derive a privacy filter for approximate zCDP. We further
construct several general families of odometers. These odometers match the
tightness of advanced composition at an arbitrary, preselected point in time,
or at all points in time simultaneously, up to a doubly-logarithmic factor. We
obtain our results by leveraging advances in martingale concentration. In sum,
we show that fully adaptive privacy is obtainable at almost no loss.
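
To make the filter abstraction concrete, below is a minimal Python sketch of an (eps_global, delta_global) privacy filter: it admits adaptively chosen per-round parameters (eps_i, delta_i) and halts the interaction once an advanced-composition-shaped bound on the realized parameters would exceed the preset global budget. The class `AdaptivePrivacyFilter`, its method names, and the specific stopping rule are illustrative assumptions, not the paper's exact construction; the paper derives its (tighter) filters via martingale concentration.

```python
import math

class AdaptivePrivacyFilter:
    """Sketch of an (eps_global, delta_global) privacy filter.

    Hypothetical illustration only: the stopping rule applies the classical
    advanced-composition bound to the realized parameters; the paper proves
    that a filter of essentially this shape remains valid even when each
    (eps, delta) is chosen adaptively.
    """

    def __init__(self, eps_global: float, delta_global: float, delta_prime: float):
        # delta_prime is the slack spent on the composition bound itself;
        # the remainder of delta_global pays for the per-round delta_i.
        assert 0.0 < delta_prime < delta_global
        self.eps_global = eps_global
        self.delta_budget = delta_global - delta_prime
        self.delta_prime = delta_prime
        self.sum_eps_sq = 0.0   # running sum of eps_i ** 2
        self.sum_eps_lin = 0.0  # running sum of eps_i * (exp(eps_i) - 1)
        self.sum_delta = 0.0    # running sum of delta_i

    def try_spend(self, eps: float, delta: float) -> bool:
        """Charge one more (eps, delta) query if the composed guarantee
        still fits inside the global budget; otherwise halt (return False)."""
        sq = self.sum_eps_sq + eps ** 2
        lin = self.sum_eps_lin + eps * math.expm1(eps)
        # Advanced-composition-shaped bound on the composed epsilon:
        # sqrt(2 * ln(1/delta') * sum eps_i^2) + sum eps_i * (e^eps_i - 1).
        eps_composed = math.sqrt(2.0 * sq * math.log(1.0 / self.delta_prime)) + lin
        if eps_composed > self.eps_global or self.sum_delta + delta > self.delta_budget:
            return False  # answering this query would overspend the budget
        self.sum_eps_sq, self.sum_eps_lin = sq, lin
        self.sum_delta += delta
        return True
```

Under a rule of this shape, a budget of eps_global = 1 admits on the order of quadratically more small-epsilon queries than the basic-composition rule sum(eps_i) <= eps_global would, which is exactly the gap between basic and advanced composition that the abstract describes.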
Related papers
- Provable Privacy with Non-Private Pre-Processing
We propose a general framework to evaluate the additional privacy cost incurred by non-private data-dependent pre-processing algorithms.
Our framework establishes upper bounds on the overall privacy guarantees by utilising two new technical notions.
arXiv Detail & Related papers (2024-03-19T17:54:49Z)
- Mean Estimation Under Heterogeneous Privacy Demands
This work considers the problem of mean estimation, where each user can impose their own privacy level.
The algorithm we propose is shown to be minimax optimal and has a near-linear run-time.
Users with weaker but differing privacy requirements are all granted more privacy than they require, in equal amounts.
arXiv Detail & Related papers (2023-10-19T20:29:19Z)
- Theoretically Principled Federated Learning for Balancing Privacy and Utility
We propose a general learning framework for protection mechanisms that protect privacy by distorting model parameters.
It can achieve personalized utility-privacy trade-off for each model parameter, on each client, at each communication round in federated learning.
arXiv Detail & Related papers (2023-05-24T13:44:02Z)
- Mean Estimation Under Heterogeneous Privacy: Some Privacy Can Be Free
This work considers the problem of mean estimation under heterogeneous Differential Privacy constraints.
The algorithm we propose is shown to be minimax optimal when there are two groups of users with distinct privacy levels.
arXiv Detail & Related papers (2023-04-27T05:23:06Z)
- On Differential Privacy and Adaptive Data Analysis with Bounded Space
We study the space complexity of the two related fields of differential privacy and adaptive data analysis.
We show that there exists a problem P that requires exponentially more space to be solved efficiently with differential privacy.
The line of work on adaptive data analysis focuses on understanding the number of samples needed for answering a sequence of adaptive queries.
arXiv Detail & Related papers (2023-02-11T14:45:31Z)
- Algorithms with More Granular Differential Privacy Guarantees
We consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis.
In this work, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person.
arXiv Detail & Related papers (2022-09-08T22:43:50Z)
- Individual Privacy Accounting for Differentially Private Stochastic Gradient Descent
We characterize privacy guarantees for individual examples when releasing models trained by DP-SGD.
We find that most examples enjoy stronger privacy guarantees than the worst-case bound.
This implies that groups underserved in terms of model utility simultaneously experience weaker privacy guarantees.
arXiv Detail & Related papers (2022-06-06T13:49:37Z)
- Practical Privacy Filters and Odometers with Rényi Differential Privacy and Applications to Differentially Private Deep Learning
We study DP composition under adaptive privacy budgets through the lens of Rényi Differential Privacy.
We prove a simpler composition theorem with smaller constants, making it practical enough to use in algorithm design.
arXiv Detail & Related papers (2021-03-02T00:37:11Z)
- Individual Privacy Accounting via a Rényi Filter
We give a method for tighter privacy loss accounting based on the value of a personalized privacy loss estimate for each individual.
Our filter is simpler and tighter than the known filter for $(\epsilon,\delta)$-differential privacy by Rogers et al.
arXiv Detail & Related papers (2020-08-25T17:49:48Z)
- Bounding, Concentrating, and Truncating: Unifying Privacy Loss Composition for Data Analytics
We provide strong privacy loss bounds when an analyst may select pure DP, bounded range (e.g. exponential mechanisms) or concentrated DP mechanisms in any order.
We also provide optimal privacy loss bounds that apply when an analyst can select pure DP and bounded range mechanisms in a batch.
arXiv Detail & Related papers (2020-04-15T17:33:10Z)