Differentially Private Estimation of Hawkes Process
- URL: http://arxiv.org/abs/2209.07303v1
- Date: Thu, 15 Sep 2022 13:59:23 GMT
- Title: Differentially Private Estimation of Hawkes Process
- Authors: Simiao Zuo, Tianyi Liu, Tuo Zhao, Hongyuan Zha
- Abstract summary: We introduce a rigorous definition of differential privacy for event stream data based on a discretized representation of the Hawkes process.
We then propose two differentially private optimization algorithms, which can efficiently estimate Hawkes process models with the desired privacy and utility guarantees.
- Score: 81.20710494974281
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Point process models are of great importance in real world applications. In
certain critical applications, estimation of point process models involves
large amounts of sensitive personal data from users. Privacy concerns naturally
arise which have not been addressed in the existing literature. To bridge this
glaring gap, we propose the first general differentially private estimation
procedure for point process models. Specifically, we take the Hawkes process as
an example, and introduce a rigorous definition of differential privacy for
event stream data based on a discretized representation of the Hawkes process.
We then propose two differentially private optimization algorithms, which can
efficiently estimate Hawkes process models with the desired privacy and utility
guarantees under two different settings. Experiments are provided to back up
our theoretical analysis.
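To make the setting concrete, the sketch below computes the conditional intensity of an exponential-kernel Hawkes process and takes one gradient step on the log-likelihood with Gaussian noise added to the gradient. This is only the generic noisy-gradient recipe for illustration; the paper's actual two algorithms, its discretized event-stream representation, and its noise calibration are not reproduced here, and all parameter values are made up.

```python
import numpy as np

def hawkes_intensity(t, events, mu, alpha, beta):
    """Conditional intensity of an exponential-kernel Hawkes process:
    lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i))."""
    past = events[events < t]
    return mu + alpha * np.exp(-beta * (t - past)).sum()

def noisy_grad_step(events, T, mu, alpha, beta, lr=0.01, noise_scale=0.1, rng=None):
    """One ascent step on the Hawkes log-likelihood w.r.t. the baseline mu,
    with Gaussian noise added to the gradient (generic DP-style noisy
    gradient; the paper's algorithms and privacy accounting differ)."""
    rng = np.random.default_rng(0) if rng is None else rng
    lam = np.array([hawkes_intensity(t, events, mu, alpha, beta) for t in events])
    # d/dmu of log-likelihood: sum_i 1/lambda(t_i) - T  (exponential kernel)
    grad_mu = (1.0 / lam).sum() - T
    grad_mu += rng.normal(scale=noise_scale)  # Gaussian-mechanism-style noise
    return mu + lr * grad_mu

events = np.array([0.5, 1.2, 1.3, 2.7])
lam = hawkes_intensity(3.0, events, mu=0.5, alpha=0.8, beta=1.0)  # baseline plus excitation
mu_new = noisy_grad_step(events, T=3.0, mu=0.5, alpha=0.8, beta=1.0)
```

The intensity exceeds the baseline `mu` whenever past events contribute excitation, which is the self-exciting behavior that makes per-event privacy subtle: one user's event raises the intensity, and hence the likelihood, at all later times.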
Related papers
- Privacy-Aware Randomized Quantization via Linear Programming [13.002534825666219]
We propose a family of quantization mechanisms that is unbiased and differentially private.
Our proposed mechanism can attain a better privacy-accuracy trade-off compared to baselines.
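As a point of reference for what "unbiased quantization" means, the snippet below implements classic stochastic rounding, whose output equals the input in expectation. The paper's contribution goes further, shaping the randomization via linear programming to also satisfy differential privacy; that construction is not reproduced here.

```python
import numpy as np

def stochastic_round(x, step, rng):
    """Round x to a multiple of `step` at random so that E[output] = x.
    Classic unbiased quantizer; the paper additionally designs the
    randomization via linear programming to obtain differential privacy."""
    lo = np.floor(x / step) * step
    p_up = (x - lo) / step              # probability of rounding up
    return lo + step * (rng.random(np.shape(x)) < p_up)

rng = np.random.default_rng(0)
vals = stochastic_round(np.full(100_000, 0.3), step=1.0, rng=rng)
# The empirical mean of `vals` is close to 0.3, illustrating unbiasedness.
```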
arXiv Detail & Related papers (2024-06-01T18:40:08Z)
- Naturally Private Recommendations with Determinantal Point Processes [0.6249768559720122]
We discuss Determinantal Point Processes (DPPs) which balance recommendations based on both the popularity and the diversity of the content.
We conclude by proposing simple alternatives to DPPs which would make them more efficient with respect to their privacy-utility trade-off.
arXiv Detail & Related papers (2024-05-22T14:20:56Z)
- Provable Privacy with Non-Private Pre-Processing [56.770023668379615]
We propose a general framework to evaluate the additional privacy cost incurred by non-private data-dependent pre-processing algorithms.
Our framework establishes upper bounds on the overall privacy guarantees by utilising two new technical notions.
arXiv Detail & Related papers (2024-03-19T17:54:49Z)
- Differentially Private Linear Regression with Linked Data [3.9325957466009203]
Differential privacy, a mathematical notion from computer science, is a rising tool offering robust privacy guarantees.
Recent work focuses on developing differentially private versions of individual statistical and machine learning tasks.
We present two differentially private algorithms for linear regression with linked data.
arXiv Detail & Related papers (2023-08-01T21:00:19Z)
- Privacy-aware Gaussian Process Regression [5.837881923712394]
The proposed method can be used when a data owner is unwilling to share a high-fidelity supervised learning model built from their data with the public due to privacy concerns.
The key idea of the proposed method is to add synthetic noise to the data until the predictive variance of the Gaussian process model reaches a prespecified privacy level.
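The "add noise until the predictive variance reaches a prespecified level" idea can be sketched with a plain RBF-kernel GP: keep doubling the observation-noise variance until the posterior variance at the query points clears a target. This is a minimal illustration of the mechanism described in the summary, not the paper's method; the kernel, the doubling schedule, and the target value are all assumptions.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def predictive_var(x_train, x_test, noise_var, ls=1.0):
    """GP posterior variance at x_test given noisy observations at x_train."""
    K = rbf(x_train, x_train, ls) + noise_var * np.eye(len(x_train))
    k_star = rbf(x_test, x_train, ls)
    v = rbf(x_test, x_test, ls) - k_star @ np.linalg.solve(K, k_star.T)
    return np.diag(v)

def noise_for_privacy(x_train, x_test, target_var, ls=1.0):
    """Double the synthetic noise variance until the minimum predictive
    variance reaches `target_var` (a stand-in for the prespecified privacy
    level; posterior variance is nondecreasing in the observation noise)."""
    noise_var = 1e-4
    while predictive_var(x_train, x_test, noise_var, ls).min() < target_var:
        noise_var *= 2.0
    return noise_var

x_train = np.linspace(0.0, 4.0, 9)
x_test = np.array([1.0, 2.5])
nv = noise_for_privacy(x_train, x_test, target_var=0.3)
```

The loop terminates because the posterior variance approaches the prior variance (here 1.0) as the noise grows, so any target below the prior variance is eventually reached.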
arXiv Detail & Related papers (2023-05-25T23:44:31Z)
- DP2-Pub: Differentially Private High-Dimensional Data Publication with Invariant Post Randomization [58.155151571362914]
We propose a differentially private high-dimensional data publication mechanism (DP2-Pub) that runs in two phases.
Splitting attributes into several low-dimensional clusters with high intra-cluster cohesion and low inter-cluster coupling helps obtain a reasonable privacy budget.
We also extend our DP2-Pub mechanism to the scenario with a semi-honest server which satisfies local differential privacy.
arXiv Detail & Related papers (2022-08-24T17:52:43Z)
- Post-processing of Differentially Private Data: A Fairness Perspective [53.29035917495491]
This paper shows that post-processing causes disparate impacts on individuals or groups.
It analyzes two critical settings: the release of differentially private datasets and the use of such private datasets for downstream decisions.
It proposes a novel post-processing mechanism that is (approximately) optimal under different fairness metrics.
arXiv Detail & Related papers (2022-01-24T02:45:03Z)
- Privacy preserving n-party scalar product protocol [0.0]
Privacy-preserving machine learning enables the training of models on decentralized datasets without the need to reveal the data.
The privacy preserving scalar product protocol, which enables the dot product of vectors without revealing them, is one popular example for its versatility.
We propose a generalization of the protocol for an arbitrary number of parties, based on an existing two-party method.
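The flavor of such protocols can be illustrated with additive secret sharing: when the dot product decomposes into per-party partial products (here, a vertically partitioned setting where party k holds slice k of both vectors), the partials can be summed so that no single party's contribution is revealed. This is only an illustration of the additive-masking idea under assumed conditions, not the paper's n-party scalar product protocol.

```python
import random

def secure_sum(values, modulus=2**61 - 1, rng=random):
    """Sum private values so no single value is revealed: each party splits
    its value into n additive shares mod `modulus`, hands one share to each
    party, and only the per-party share totals are pooled at the end."""
    n = len(values)
    received = [0] * n                      # shares accumulated by each party
    for v in values:
        shares = [rng.randrange(modulus) for _ in range(n - 1)]
        shares.append((v - sum(shares)) % modulus)   # shares reconstruct v
        for j, s in enumerate(shares):
            received[j] = (received[j] + s) % modulus
    return sum(received) % modulus

# Vertically partitioned dot product: party k holds slice k of both vectors.
x = [[1, 2], [3, 4], [5]]
y = [[6, 7], [8, 9], [10]]
partials = [sum(a * b for a, b in zip(xs, ys)) for xs, ys in zip(x, y)]
dot = secure_sum(partials)   # equals the plain dot product, 130
```

Each party's partial product stays hidden behind uniformly random shares; only the aggregate survives the modular cancellation.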
arXiv Detail & Related papers (2021-12-17T11:14:53Z)
- Private Prediction Sets [72.75711776601973]
Machine learning systems need reliable uncertainty quantification and protection of individuals' privacy.
We present a framework that treats these two desiderata jointly.
We evaluate the method on large-scale computer vision datasets.
arXiv Detail & Related papers (2021-02-11T18:59:11Z)
- Bias and Variance of Post-processing in Differential Privacy [53.29035917495491]
Post-processing immunity is a fundamental property of differential privacy.
It is often argued that post-processing may introduce bias and increase variance.
This paper takes a first step towards understanding the properties of post-processing.
arXiv Detail & Related papers (2020-10-09T02:12:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.