Optimal and Differentially Private Data Acquisition: Central and Local Mechanisms
- URL: http://arxiv.org/abs/2201.03968v2
- Date: Wed, 6 Sep 2023 02:30:02 GMT
- Title: Optimal and Differentially Private Data Acquisition: Central and Local Mechanisms
- Authors: Alireza Fallah, Ali Makhdoumi, Azarakhsh Malekian, Asuman Ozdaglar
- Abstract summary: We consider a platform's problem of collecting data from privacy-sensitive users to estimate an underlying parameter of interest.
We consider two popular differential privacy settings for providing privacy guarantees for the users: central and local.
We pose the mechanism design problem as the optimal selection of an estimator and payments that will elicit truthful reporting of users' privacy sensitivities.
- Score: 9.599356978682108
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider a platform's problem of collecting data from privacy-sensitive
users to estimate an underlying parameter of interest. We formulate this
question as a Bayesian-optimal mechanism design problem, in which an individual
can share her (verifiable) data in exchange for a monetary reward or services,
but at the same time has a (private) heterogeneous privacy cost which we
quantify using differential privacy. We consider two popular differential
privacy settings for providing privacy guarantees for the users: central and
local. In both settings, we establish minimax lower bounds for the estimation
error and derive (near) optimal estimators for given heterogeneous privacy loss
levels for users. Building on this characterization, we pose the mechanism
design problem as the optimal selection of an estimator and payments that will
elicit truthful reporting of users' privacy sensitivities. Under a regularity
condition on the distribution of privacy sensitivities, we develop efficient
algorithmic mechanisms to solve this problem in both privacy settings. Our
mechanism in the central setting can be implemented in time $\mathcal{O}(n \log
n)$ where $n$ is the number of users and our mechanism in the local setting
admits a Polynomial Time Approximation Scheme (PTAS).
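
To make the local setting concrete, here is a minimal Python sketch of mean estimation under local differential privacy with heterogeneous privacy levels: each user adds Laplace noise calibrated to her own level $\varepsilon_i$, and the server combines reports with inverse-variance weights. The weighting is a standard illustrative choice, not the paper's exact (near) optimal estimator.

```python
import numpy as np

def local_dp_weighted_mean(x, eps, rng=None):
    """Estimate the mean of data in [0, 1] under local DP with
    heterogeneous privacy levels eps[i] (smaller = more private).

    Each user i perturbs her own point with Laplace noise of scale
    1/eps[i] (sensitivity 1), so only the noisy report leaves her
    device. The server combines reports with inverse-variance
    weights, favoring less privacy-sensitive users. Illustrative
    linear estimator, not the paper's exact optimal mechanism."""
    rng = np.random.default_rng() if rng is None else rng
    x, eps = np.asarray(x, float), np.asarray(eps, float)
    reports = x + rng.laplace(scale=1.0 / eps)   # local randomizers
    var = 2.0 / eps**2                           # Laplace(b) variance = 2 b^2
    w = (1.0 / var) / np.sum(1.0 / var)          # inverse-variance weights
    return float(np.dot(w, reports))

# Example: users with very different privacy sensitivities.
rng = np.random.default_rng(0)
x = rng.uniform(size=1000)
eps = rng.choice([0.5, 2.0, 8.0], size=1000)
print(local_dp_weighted_mean(x, eps, rng))   # close to 0.5
```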
Related papers
- Enhanced Privacy Bound for Shuffle Model with Personalized Privacy [32.08637708405314]
The shuffle model of Differential Privacy (DP) is an enhanced privacy protocol which introduces an intermediate trusted server between local users and a central data curator.
It significantly amplifies the central DP guarantee by anonymizing and shuffling the local randomized data.
This work focuses on deriving the central privacy bound for a more practical setting where personalized local privacy is required by each user.
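
A minimal sketch of the shuffle-model pipeline with personalized local privacy levels, assuming a standard $k$-ary randomized response as the local randomizer; the amplified central privacy bound is the paper's contribution and is not computed here.

```python
import math
import random

def k_rr(value, k, eps, rng):
    """k-ary randomized response: report the true value with
    probability e^eps / (e^eps + k - 1), otherwise a uniformly
    random other value; this satisfies eps-local DP."""
    p_true = math.exp(eps) / (math.exp(eps) + k - 1)
    if rng.random() < p_true:
        return value
    return rng.choice([v for v in range(k) if v != value])

def shuffle_model_round(data, epsilons, k, seed=0):
    """Each user randomizes locally with her own eps_i (personalized
    local DP), then a trusted shuffler uniformly permutes the
    anonymized reports before the central curator sees them."""
    rng = random.Random(seed)
    reports = [k_rr(x, k, e, rng) for x, e in zip(data, epsilons)]
    rng.shuffle(reports)   # the shuffler's only job: permute
    return reports
```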
arXiv Detail & Related papers (2024-07-25T16:11:56Z)
- Privacy Amplification for the Gaussian Mechanism via Bounded Support [64.86780616066575]
Data-dependent privacy accounting frameworks such as per-instance differential privacy (pDP) and Fisher information loss (FIL) confer fine-grained privacy guarantees for individuals in a fixed training dataset.
We propose simple modifications of the Gaussian mechanism with bounded support, showing that they amplify privacy guarantees under data-dependent accounting.
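
A minimal sketch of one way to give the Gaussian mechanism bounded support, via rejection sampling onto an interval; this is an illustrative variant, and the data-dependent (pDP/FIL) amplification analysis is the paper's result, not reproduced here.

```python
import numpy as np

def bounded_support_gaussian(value, sigma, lo, hi, rng=None):
    """Gaussian mechanism whose output support is truncated to
    [lo, hi] by rejection sampling -- one simple way to bound the
    support of the output distribution."""
    rng = np.random.default_rng() if rng is None else rng
    while True:
        out = value + rng.normal(scale=sigma)
        if lo <= out <= hi:
            return out

print(bounded_support_gaussian(0.7, sigma=0.5, lo=0.0, hi=1.0))
```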
arXiv Detail & Related papers (2024-03-07T21:22:07Z)
- A Randomized Approach for Tight Privacy Accounting [63.67296945525791]
We propose a new differential privacy paradigm called estimate-verify-release (EVR).
The EVR paradigm first estimates the privacy parameter of a mechanism, then verifies whether it meets this guarantee, and finally releases the query output.
Our empirical evaluation shows the newly proposed EVR paradigm improves the utility-privacy tradeoff for privacy-preserving machine learning.
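
The three-step control flow, sketched at the level of the summary; `estimate_epsilon` and `verify` are hypothetical placeholders for the paper's estimation and verification components.

```python
def estimate_verify_release(mechanism, query, data,
                            estimate_epsilon, verify):
    """Estimate-verify-release (EVR) control flow: (1) estimate the
    mechanism's privacy parameter, (2) verify the estimate actually
    holds, (3) release the output only if verification passes.
    `estimate_epsilon` and `verify` are hypothetical placeholders."""
    eps_hat = estimate_epsilon(mechanism, query, data)   # step 1
    if not verify(mechanism, query, eps_hat):            # step 2
        raise RuntimeError("privacy estimate failed verification")
    return mechanism(query, data), eps_hat               # step 3
```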
arXiv Detail & Related papers (2023-04-17T00:38:01Z)
- Privacy-Preserving Matrix Factorization for Recommendation Systems using Gaussian Mechanism [2.84279467589473]
We propose a privacy-preserving recommendation system based on the differential privacy framework and matrix factorization.
Differential privacy is a powerful and robust mathematical framework for designing privacy-preserving machine learning algorithms, making it possible to prevent adversaries from extracting sensitive user information.
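
A hedged sketch of one standard way to combine matrix factorization with the Gaussian mechanism: clip each gradient step and perturb it with Gaussian noise. The paper's exact construction may differ.

```python
import numpy as np

def dp_matrix_factorization(R, rank=8, steps=200, lr=0.01,
                            sigma=0.1, clip=1.0, seed=0):
    """Factor a ratings matrix R (NaN = unobserved) as U @ V.T while
    perturbing each gradient step with clipped Gaussian noise -- a
    standard DP-SGD-style sketch, not the paper's exact method."""
    rng = np.random.default_rng(seed)
    n, m = R.shape
    mask = ~np.isnan(R)
    U = rng.normal(scale=0.1, size=(n, rank))
    V = rng.normal(scale=0.1, size=(m, rank))
    for _ in range(steps):
        err = np.where(mask, np.nan_to_num(R) - U @ V.T, 0.0)
        gU, gV = -err @ V, -err.T @ U
        for g in (gU, gV):                         # bound sensitivity
            g *= min(1.0, clip / (np.linalg.norm(g) + 1e-12))
        U -= lr * (gU + rng.normal(scale=sigma, size=gU.shape))
        V -= lr * (gV + rng.normal(scale=sigma, size=gV.shape))
    return U, V

R = np.full((6, 5), np.nan)
R[0, 0], R[1, 2], R[3, 4] = 5.0, 3.0, 4.0
U, V = dp_matrix_factorization(R, rank=2)
```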
arXiv Detail & Related papers (2023-04-11T13:50:39Z)
- Breaking the Communication-Privacy-Accuracy Tradeoff with $f$-Differential Privacy [51.11280118806893]
We consider a federated data analytics problem in which a server coordinates the collaborative data analysis of multiple users with privacy concerns and limited communication capability.
We study the local differential privacy guarantees of discrete-valued mechanisms with finite output space through the lens of $f$-differential privacy ($f$-DP).
More specifically, we advance the existing literature by deriving tight $f$-DP guarantees for a variety of discrete-valued mechanisms.
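
For reference, the standard trade-off-function definition in which $f$-DP guarantees are stated (background from the $f$-DP literature, not the paper's new bounds):

```latex
% Trade-off function between distributions P and Q, where the infimum
% is over rejection rules \phi (type I error vs. type II error):
T(P,Q)(\alpha) \;=\; \inf_{\phi}\bigl\{\, 1-\mathbb{E}_{Q}[\phi] \;:\; \mathbb{E}_{P}[\phi]\le\alpha \,\bigr\},
% and a mechanism M is f-DP if, for all neighboring datasets D, D',
T\bigl(M(D),\,M(D')\bigr) \;\ge\; f .
```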
arXiv Detail & Related papers (2023-02-19T16:58:53Z)
- DP2-Pub: Differentially Private High-Dimensional Data Publication with Invariant Post Randomization [58.155151571362914]
We propose a differentially private high-dimensional data publication mechanism (DP2-Pub) that runs in two phases.
Splitting attributes into several low-dimensional clusters with high intra-cluster cohesion and low inter-cluster coupling helps obtain a reasonable privacy budget.
We also extend our DP2-Pub mechanism to the scenario with a semi-honest server which satisfies local differential privacy.
arXiv Detail & Related papers (2022-08-24T17:52:43Z)
- Decentralized Stochastic Optimization with Inherent Privacy Protection [103.62463469366557]
Decentralized optimization is the basic building block of modern collaborative machine learning, distributed estimation and control, and large-scale sensing.
Since the involved data usually contain sensitive information, privacy protection has become an increasingly pressing need in the implementation of decentralized optimization algorithms.
arXiv Detail & Related papers (2022-05-08T14:38:23Z)
- Production of Categorical Data Verifying Differential Privacy: Conception and Applications to Machine Learning [0.0]
Differential privacy is a formal definition that allows quantifying the privacy-utility trade-off.
With the local DP (LDP) model, users can sanitize their data locally before transmitting it to the server.
In all cases, we concluded that differentially private ML models achieve nearly the same utility metrics as non-private ones.
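
A minimal sketch of the LDP model for categorical data: users sanitize locally with $k$-ary randomized response, and the server debiases the reported counts into unbiased frequency estimates. This is the standard protocol shape, offered for illustration rather than as the paper's specific method.

```python
import math
from collections import Counter

def grr_frequencies(reports, k, eps):
    """Unbiased frequency estimates from k-ary randomized-response
    reports under eps-LDP. With p = e^eps/(e^eps + k - 1) the chance
    of reporting truthfully and q = (1 - p)/(k - 1) of reporting any
    given other value, E[count_v]/n = f_v*p + (1 - f_v)*q, which we
    invert to recover the true frequencies f_v."""
    n = len(reports)
    p = math.exp(eps) / (math.exp(eps) + k - 1)
    q = (1.0 - p) / (k - 1)
    counts = Counter(reports)
    return {v: (counts.get(v, 0) / n - q) / (p - q) for v in range(k)}
```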
arXiv Detail & Related papers (2022-04-02T12:50:14Z)
- Graph-Homomorphic Perturbations for Private Decentralized Learning [64.26238893241322]
In decentralized learning, the local exchange of estimates allows inference of the underlying private data.
Adding perturbations chosen independently at every agent protects privacy but results in a significant performance loss.
We propose an alternative scheme, which constructs perturbations according to a particular nullspace condition, allowing them to be invisible to the network centroid.
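
A minimal sketch of the nullspace idea as stated in the summary: draw per-agent perturbations that sum to zero across the network, so they lie in the nullspace of the averaging operation and cancel in the network centroid while still masking individual exchanges. The paper's actual construction may differ.

```python
import numpy as np

def nullspace_perturbations(n_agents, dim, sigma, rng=None):
    """Per-agent noise vectors that sum to zero across the network,
    so they vanish in the network centroid (average) while still
    masking each agent's individual estimate."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(scale=sigma, size=(n_agents, dim))
    noise -= noise.mean(axis=0, keepdims=True)   # project onto nullspace of averaging
    return noise

noise = nullspace_perturbations(5, 3, sigma=1.0, rng=np.random.default_rng(1))
print(np.allclose(noise.sum(axis=0), 0.0))   # True: invisible to the centroid
```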
arXiv Detail & Related papers (2020-10-23T10:35:35Z)
- Differential Privacy at Risk: Bridging Randomness and Privacy Budget [5.393465689287103]
We analyse roles of the sources of randomness, namely the explicit randomness induced by the noise distribution and the implicit randomness induced by the data-generation distribution.
We propose privacy at risk that is a probabilistic calibration of privacy-preserving mechanisms.
We show that composition using the cost optimal privacy at risk provides stronger privacy guarantee than the classical advanced composition.
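
A paraphrase of the privacy-at-risk notion described above; the exact formal definition in the paper may differ in detail:

```latex
% Paraphrase: a mechanism \mathcal{M} is (\varepsilon,\gamma)-private
% at risk if, for neighboring datasets D, D' and outputs o,
\Pr\!\left[\,\left|\ln\frac{\Pr[\mathcal{M}(D)=o]}{\Pr[\mathcal{M}(D')=o]}\right|\le\varepsilon\right] \;\ge\; \gamma,
% with the probability taken over the mechanism's explicit noise
% randomness (and, in one variant, the data-generation distribution).
```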
arXiv Detail & Related papers (2020-03-02T15:44:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.