A Unified Approach to Differentially Private Bayes Point Estimation
- URL: http://arxiv.org/abs/2211.10332v1
- Date: Fri, 18 Nov 2022 16:42:49 GMT
- Title: A Unified Approach to Differentially Private Bayes Point Estimation
- Authors: Braghadeesh Lakshminarayanan and Cristian R. Rojas
- Abstract summary: The notion of differential privacy (DP) has been proposed, which enforces confidentiality by introducing randomization in the estimates.
Standard algorithms for differentially private estimation are based on adding an appropriate amount of noise to the output of a traditional point estimation method.
We propose a new Unified Bayes Private Point (UBaPP) approach to Bayes point estimation of the unknown parameters of a data generating mechanism under a DP constraint.
- Score: 7.599399338954307
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Parameter estimation in statistics and system identification relies on data
that may contain sensitive information. To protect this sensitive information,
the notion of \emph{differential privacy} (DP) has been proposed, which
enforces confidentiality by introducing randomization in the estimates.
Standard algorithms for differentially private estimation are based on adding
an appropriate amount of noise to the output of a traditional point estimation
method. This leads to an accuracy-privacy trade-off, as adding more noise
reduces the accuracy while increasing privacy. In this paper, we propose a new
Unified Bayes Private Point (UBaPP) approach to Bayes point estimation of the
unknown parameters of a data generating mechanism under a DP constraint, which
achieves a better accuracy-privacy trade-off than traditional approaches. We
verify the performance of our approach on a simple numerical example.
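The standard noise-adding recipe the abstract describes can be sketched with the Laplace mechanism. This is a minimal illustrative example, not the paper's UBaPP method; the function name, clipping bounds, and data are assumptions introduced for illustration.

```python
import numpy as np

def dp_mean_estimate(data, epsilon, lower=0.0, upper=1.0, rng=None):
    """Epsilon-DP mean estimate via the Laplace mechanism (sketch).

    Clipping each record to [lower, upper] bounds the sensitivity of
    the mean at (upper - lower) / n, so Laplace noise with scale
    sensitivity / epsilon yields epsilon-differential privacy.
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped = np.clip(np.asarray(data, dtype=float), lower, upper)
    sensitivity = (upper - lower) / len(clipped)
    return clipped.mean() + rng.laplace(0.0, sensitivity / epsilon)
```

Decreasing `epsilon` (stronger privacy) increases the noise scale, which is exactly the accuracy-privacy trade-off the abstract refers to.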
Related papers
- Enhancing Feature-Specific Data Protection via Bayesian Coordinate Differential Privacy [55.357715095623554]
Local Differential Privacy (LDP) offers strong privacy guarantees without requiring users to trust external parties.
We propose a Bayesian framework, Bayesian Coordinate Differential Privacy (BCDP), that enables feature-specific privacy quantification.
arXiv Detail & Related papers (2024-10-24T03:39:55Z)
- Privacy-Preserving Set-Based Estimation Using Differential Privacy and Zonotopes [2.206168301581203]
For large-scale cyber-physical systems, the collaboration of spatially distributed sensors is often needed to perform the state estimation process.
Privacy concerns arise from disclosing sensitive measurements to a cloud estimator.
We propose a differentially private set-based estimation protocol that guarantees true state containment in the estimated set and differential privacy for the sensitive measurements.
arXiv Detail & Related papers (2024-08-30T13:05:38Z)
- Adaptive Differentially Quantized Subspace Perturbation (ADQSP): A Unified Framework for Privacy-Preserving Distributed Average Consensus [6.364764301218972]
We propose a general approach named adaptive differentially quantized subspace perturbation (ADQSP).
We show that by varying a single quantization parameter the proposed method can vary between SMPC-type performances and DP-type performances.
Our results show the potential of exploiting traditional distributed signal processing tools for providing cryptographic guarantees.
arXiv Detail & Related papers (2023-12-13T07:52:16Z)
- Initialization Matters: Privacy-Utility Analysis of Overparameterized Neural Networks [72.51255282371805]
We prove a privacy bound for the KL divergence between model distributions on worst-case neighboring datasets.
We find that this KL privacy bound is largely determined by the expected squared gradient norm relative to model parameters during training.
arXiv Detail & Related papers (2023-10-31T16:13:22Z)
- A Randomized Approach for Tight Privacy Accounting [63.67296945525791]
We propose a new differential privacy paradigm called estimate-verify-release (EVR).
The EVR paradigm first estimates the privacy parameter of a mechanism, then verifies whether it meets this guarantee, and finally releases the query output.
Our empirical evaluation shows the newly proposed EVR paradigm improves the utility-privacy tradeoff for privacy-preserving machine learning.
arXiv Detail & Related papers (2023-04-17T00:38:01Z)
- Breaking the Communication-Privacy-Accuracy Tradeoff with $f$-Differential Privacy [51.11280118806893]
We consider a federated data analytics problem in which a server coordinates the collaborative data analysis of multiple users with privacy concerns and limited communication capability.
We study the local differential privacy guarantees of discrete-valued mechanisms with finite output space through the lens of $f$-differential privacy ($f$-DP).
More specifically, we advance the existing literature by deriving tight $f$-DP guarantees for a variety of discrete-valued mechanisms.
arXiv Detail & Related papers (2023-02-19T16:58:53Z)
- Differentially Private Online Bayesian Estimation With Adaptive Truncation [1.14219428942199]
We propose a novel online and adaptive truncation method for differentially private Bayesian online estimation of a static parameter regarding a population.
We aim to design predictive queries with small sensitivity, hence small privacy-preserving noise, enabling more accurate estimation while maintaining the same level of privacy.
arXiv Detail & Related papers (2023-01-19T17:53:53Z)
- Differentially Private Stochastic Gradient Descent with Low-Noise [49.981789906200035]
Modern machine learning algorithms aim to extract fine-grained information from data to provide accurate predictions, which often conflicts with the goal of privacy protection.
This paper addresses the practical and theoretical importance of developing privacy-preserving machine learning algorithms that ensure good performance while preserving privacy.
arXiv Detail & Related papers (2022-09-09T08:54:13Z)
- Locally Differentially Private Bayesian Inference [23.882144188177275]
Local differential privacy (LDP) has emerged as a technique of choice for privacy-preserving data collection in several scenarios when the aggregator is not trustworthy.
We provide a noise-aware probabilistic modeling framework, which allows Bayesian inference to take into account the noise added for privacy under LDP.
arXiv Detail & Related papers (2021-10-27T13:36:43Z)
- Non-parametric Differentially Private Confidence Intervals for the Median [3.205141100055992]
This paper proposes and evaluates several strategies to compute valid differentially private confidence intervals for the median.
We also illustrate that addressing both sources of uncertainty (the error from sampling and the error from protecting the output) should be preferred over simpler approaches that incorporate the uncertainty in a sequential fashion.
arXiv Detail & Related papers (2021-06-18T19:45:37Z)
- RDP-GAN: A Rényi-Differential Privacy based Generative Adversarial Network [75.81653258081435]
Generative adversarial network (GAN) has attracted increasing attention recently owing to its impressive ability to generate realistic samples with high privacy protection.
However, when GANs are applied on sensitive or private training examples, such as medical or financial records, it is still probable to divulge individuals' sensitive and private information.
We propose a Rényi-differentially private GAN (RDP-GAN), which achieves differential privacy (DP) in a GAN by carefully adding random noise to the value of the loss function during training.
arXiv Detail & Related papers (2020-07-04T09:51:02Z)
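The loss-perturbation idea described in the RDP-GAN summary can be roughly illustrated as follows. This is a sketch only: the function name and parameters are hypothetical, and the actual paper calibrates the noise across training steps through Rényi DP accounting, which this toy version omits.

```python
import numpy as np

def perturbed_loss(loss_value, clip=1.0, epsilon=1.0, rng=None):
    """Clip a scalar training loss to [0, clip] and add Laplace noise.

    Clipping bounds the sensitivity of the released loss by `clip`, so
    a Laplace scale of clip / epsilon is the usual calibration for a
    single epsilon-DP release. (Composition over many training steps
    is what Renyi-DP accounting handles; it is omitted here.)
    """
    rng = np.random.default_rng() if rng is None else rng
    clipped = float(np.clip(loss_value, 0.0, clip))
    return clipped + rng.laplace(0.0, clip / epsilon)
```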
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.