Privacy-Preserving ECG Data Analysis with Differential Privacy: A Literature Review and A Case Study
- URL: http://arxiv.org/abs/2406.13880v1
- Date: Wed, 19 Jun 2024 23:17:16 GMT
- Title: Privacy-Preserving ECG Data Analysis with Differential Privacy: A Literature Review and A Case Study
- Authors: Arin Ghazarian, Jianwei Zheng, Cyril Rakovski
- Abstract summary: We provide an overview of key concepts in differential privacy, followed by a literature review and discussion of its application to ECG analysis.
In the second part of the paper, we explore how to implement differentially private query release on an arrhythmia database using a six-step process.
- Score: 1.1156009461711638
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Differential privacy has become the preeminent technique to protect the privacy of individuals in a database while allowing useful results from data analysis to be shared. Notably, it guarantees the amount of privacy loss in the worst-case scenario. Although many theoretical research papers have been published, practical real-life application of differential privacy demands estimating several important parameters without any clear solutions or guidelines. In the first part of the paper, we provide an overview of key concepts in differential privacy, followed by a literature review and discussion of its application to ECG analysis. In the second part of the paper, we explore how to implement differentially private query release on an arrhythmia database using a six-step process. We provide guidelines and discuss the related literature for all the steps involved, such as selection of the $\epsilon$ value, distribution of the total $\epsilon$ budget across the queries, and estimation of the sensitivity for the query functions. At the end, we discuss the shortcomings and challenges of applying differential privacy to ECG datasets.
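For concreteness, here is a minimal sketch of differentially private query release on a toy arrhythmia-style table with the Laplace mechanism. The clipping range, query functions, even budget split, and $\epsilon$ values are illustrative assumptions, not the paper's six-step choices:

```python
# A minimal sketch of DP query release via the Laplace mechanism.
# Data, clipping range, and epsilon values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Toy records: one heart rate (bpm) per patient, clipped to a known range so
# the sensitivity of each query can be bounded.
LO, HI = 30.0, 220.0
heart_rates = np.clip(np.array([72.0, 88.0, 140.0, 65.0, 101.0]), LO, HI)
n = len(heart_rates)

def laplace_release(true_value, sensitivity, epsilon):
    """Release true_value perturbed by Laplace(sensitivity / epsilon) noise."""
    return true_value + rng.laplace(scale=sensitivity / epsilon)

# Distribute a total epsilon budget across the queries (here, evenly).
total_epsilon = 1.0
per_query_epsilon = total_epsilon / 2

# Count query: changing one record changes the count by at most 1.
noisy_count = laplace_release(n, sensitivity=1.0, epsilon=per_query_epsilon)

# Mean query: with values clipped to [LO, HI], changing one record moves the
# mean of n records by at most (HI - LO) / n.
noisy_mean = laplace_release(heart_rates.mean(),
                             sensitivity=(HI - LO) / n,
                             epsilon=per_query_epsilon)

print(noisy_count, noisy_mean)
```

Calibrating the noise scale to sensitivity/$\epsilon$ per query and splitting the total budget evenly is the simplest instance of the budget-distribution and sensitivity-estimation steps the paper discusses.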
Related papers
- A Statistical Viewpoint on Differential Privacy: Hypothesis Testing, Representation and Blackwell's Theorem [30.365274034429508]
We argue that differential privacy can be considered a purely statistical concept.
$f$-differential privacy is a unified framework for analyzing privacy bounds in data analysis and machine learning.
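For reference, $f$-DP quantifies privacy through a trade-off function between the type-I and type-II errors of testing which of two neighboring datasets produced an output; the canonical Gaussian-DP instance from the $f$-DP literature is:

```latex
% Trade-off function of $\mu$-Gaussian DP; $\Phi$ is the standard normal CDF
% and $\alpha$ the type-I error level.
G_\mu(\alpha) = \Phi\big(\Phi^{-1}(1-\alpha) - \mu\big)
```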
arXiv Detail & Related papers (2024-09-14T23:47:22Z)
- Empirical Mean and Frequency Estimation Under Heterogeneous Privacy: A Worst-Case Analysis [5.755004576310333]
Differential Privacy (DP) is the current gold standard for measuring privacy.
We consider the problems of empirical mean estimation for univariate data and frequency estimation for categorical data, subject to heterogeneous privacy constraints.
We prove some optimality results, under both PAC error and mean-squared error, for our proposed algorithms and demonstrate superior performance over other baseline techniques experimentally.
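As a rough illustration of the heterogeneous setting, the sketch below averages locally perturbed reports, with each user's Laplace noise calibrated to their personal $\epsilon$. It is a simple local-DP baseline under assumed per-user budgets and a $[0,1]$ data range, not the paper's (optimal) algorithm:

```python
# A minimal local-DP baseline for mean estimation under heterogeneous privacy.
# Per-user epsilons and the data range are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def local_laplace_mean(values, epsilons, lo=0.0, hi=1.0):
    """Each user perturbs their own value with Laplace noise calibrated to
    their personal epsilon; the server averages the noisy reports."""
    values = np.clip(values, lo, hi)
    sensitivity = hi - lo                    # range of a single report
    noisy = values + rng.laplace(scale=sensitivity / np.asarray(epsilons))
    return noisy.mean()

values = rng.uniform(size=1000)
epsilons = np.where(rng.uniform(size=1000) < 0.5, 0.5, 2.0)  # two privacy groups
print(local_laplace_mean(values, epsilons))
```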
arXiv Detail & Related papers (2024-07-15T22:46:02Z)
- Centering Policy and Practice: Research Gaps around Usable Differential Privacy [12.340264479496375]
We argue that while differential privacy is a clean formulation in theory, it poses significant challenges in practice.
To bridge the gaps between differential privacy's promises and its real-world usability, researchers and practitioners must work together.
arXiv Detail & Related papers (2024-06-17T21:32:30Z)
- A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key to technological progress in many real-world application scenarios that involve privacy-sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z)
- A Randomized Approach for Tight Privacy Accounting [63.67296945525791]
We propose a new differential privacy paradigm called estimate-verify-release (EVR).
The EVR paradigm first estimates the privacy parameter of a mechanism, then verifies whether it meets this guarantee, and finally releases the query output.
Our empirical evaluation shows the newly proposed EVR paradigm improves the utility-privacy tradeoff for privacy-preserving machine learning.
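A minimal sketch of that three-step flow follows; the estimator, verifier, and mechanism here are placeholders, not the paper's actual components:

```python
# A minimal sketch of the estimate-verify-release (EVR) control flow.
import numpy as np

rng = np.random.default_rng(2)

def run_evr(query_output_fn, estimate_epsilon_fn, verify_fn):
    """Estimate a privacy parameter, verify the mechanism meets it, and only
    then release the query output; otherwise release nothing (fail closed)."""
    eps_estimate = estimate_epsilon_fn()          # 1. estimate
    if not verify_fn(eps_estimate):               # 2. verify
        return None                               # verification failed
    return query_output_fn()                      # 3. release

release = run_evr(
    query_output_fn=lambda: 42 + rng.laplace(scale=1.0),  # noisy query answer
    estimate_epsilon_fn=lambda: 1.0,              # e.g. from an accountant
    verify_fn=lambda eps: eps <= 1.5,             # target privacy budget
)
print(release)
```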
arXiv Detail & Related papers (2023-04-17T00:38:01Z)
- Breaking the Communication-Privacy-Accuracy Tradeoff with $f$-Differential Privacy [51.11280118806893]
We consider a federated data analytics problem in which a server coordinates the collaborative data analysis of multiple users with privacy concerns and limited communication capability.
We study the local differential privacy guarantees of discrete-valued mechanisms with finite output space through the lens of $f$-differential privacy ($f$-DP).
More specifically, we advance the existing literature by deriving tight $f$-DP guarantees for a variety of discrete-valued mechanisms.
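A canonical discrete-valued local DP mechanism of the kind such analyses cover is binary randomized response, sketched below with an illustrative $\epsilon$ (the paper's tight $f$-DP bounds are not reproduced here):

```python
# A minimal sketch of binary randomized response, which satisfies
# epsilon-local DP; the epsilon value below is illustrative.
import math
import random

def randomized_response(bit, epsilon):
    """Report the true bit with probability e^eps / (1 + e^eps), else flip it."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p_truth else 1 - bit

reports = [randomized_response(b, epsilon=1.0) for b in [0, 1, 1, 0, 1]]
print(reports)
```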
arXiv Detail & Related papers (2023-02-19T16:58:53Z)
- Algorithms with More Granular Differential Privacy Guarantees [65.3684804101664]
We consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis.
In this work, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person.
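A minimal sketch of the per-attribute idea, releasing each attribute of a record with noise calibrated to its own budget; the attributes, ranges, and $\epsilon$ values are illustrative assumptions, not the paper's constructions:

```python
# A minimal sketch of per-attribute privacy: each attribute gets its own
# Laplace noise, so privacy loss can be quantified attribute by attribute.
import numpy as np

rng = np.random.default_rng(4)

def release_record(record, ranges, epsilons):
    """Release each attribute with noise calibrated to its own epsilon;
    the i-th attribute alone is protected at level epsilons[i]."""
    out = {}
    for attr, value in record.items():
        lo, hi = ranges[attr]
        out[attr] = value + rng.laplace(scale=(hi - lo) / epsilons[attr])
    return out

record = {"heart_rate": 82.0, "qt_interval_ms": 400.0}
ranges = {"heart_rate": (30.0, 220.0), "qt_interval_ms": (200.0, 600.0)}
epsilons = {"heart_rate": 0.5, "qt_interval_ms": 2.0}  # per-attribute budgets
print(release_record(record, ranges, epsilons))
```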
arXiv Detail & Related papers (2022-09-08T22:43:50Z)
- Private Domain Adaptation from a Public Source [48.83724068578305]
We design differentially private discrepancy-based algorithms for adaptation from a source domain with public labeled data to a target domain with unlabeled private data.
Our solutions are based on private variants of Frank-Wolfe and Mirror-Descent algorithms.
arXiv Detail & Related papers (2022-08-12T06:52:55Z)
- Partial sensitivity analysis in differential privacy [58.730520380312676]
We investigate the impact of each input feature on the individual's privacy loss.
We experimentally evaluate our approach on queries over private databases.
We also explore our findings in the context of neural network training on synthetic data.
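As a toy illustration of attributing privacy loss to input features, the sketch below splits a linear query's worst-case sensitivity across features by weight times range; the weights and ranges are invented for illustration and this is not the paper's method:

```python
# A toy per-feature decomposition of a linear query's sensitivity.
# Weights and feature ranges are illustrative assumptions.
weights = {"heart_rate": 0.01, "age": 0.5, "qt_interval_ms": 0.002}
ranges = {"heart_rate": (30.0, 220.0), "age": (0.0, 120.0),
          "qt_interval_ms": (200.0, 600.0)}

# Each feature's share of the worst-case change when one record is altered.
partial = {f: abs(w) * (ranges[f][1] - ranges[f][0]) for f, w in weights.items()}
total_sensitivity = sum(partial.values())

for feature, share in partial.items():
    print(feature, share / total_sensitivity)  # fraction of the total sensitivity
```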
arXiv Detail & Related papers (2021-09-22T08:29:16Z)
- Individual Privacy Accounting via a Renyi Filter [33.65665839496798]
We give a method for tighter privacy loss accounting based on the value of a personalized privacy loss estimate for each individual.
Our filter is simpler and tighter than the known filter for $(\epsilon,\delta)$-differential privacy by Rogers et al.
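A minimal sketch of the filtering idea, tracking a personalized privacy-loss tally per individual and excluding anyone whose budget would be exceeded; the losses and budget are illustrative, and the paper's Renyi filter analysis is considerably sharper:

```python
# A minimal sketch of an individual privacy-loss filter: each person accrues
# a personalized loss per query and is excluded once their budget would be
# exceeded. Losses and budget are illustrative.
def filtered_participants(spent, per_query_loss, budget):
    """Return the ids still allowed to participate, updating their tallies."""
    allowed = []
    for pid, loss in per_query_loss.items():
        if spent.get(pid, 0.0) + loss <= budget:   # filter condition
            spent[pid] = spent.get(pid, 0.0) + loss
            allowed.append(pid)
    return allowed

spent = {}
print(filtered_participants(spent, {"a": 0.4, "b": 0.9}, budget=1.0))  # ['a', 'b']
print(filtered_participants(spent, {"a": 0.4, "b": 0.9}, budget=1.0))  # ['a']
```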
arXiv Detail & Related papers (2020-08-25T17:49:48Z)
- Auditing Differentially Private Machine Learning: How Private is Private SGD? [16.812900569416062]
We investigate whether Differentially Private SGD offers better privacy in practice than what is guaranteed by its state-of-the-art analysis.
We do so via novel data poisoning attacks, which we show correspond to realistic privacy attacks.
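For context, the audited mechanism is DP-SGD; a minimal sketch of one step (per-example clipping plus Gaussian noise) follows, with an illustrative clip norm and noise multiplier:

```python
# A minimal sketch of one DP-SGD step: clip each example's gradient, average,
# add Gaussian noise, and take a step. Hyperparameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)

def dp_sgd_step(params, per_example_grads, clip_norm=1.0,
                noise_multiplier=1.1, lr=0.1):
    """Clip each per-example gradient to clip_norm, average, add Gaussian
    noise with std noise_multiplier * clip_norm / batch_size, then update."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    noise = rng.normal(scale=noise_multiplier * clip_norm / len(clipped),
                       size=mean_grad.shape)
    return params - lr * (mean_grad + noise)

params = np.zeros(3)
grads = [rng.normal(size=3) for _ in range(8)]   # toy per-example gradients
print(dp_sgd_step(params, grads))
```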
arXiv Detail & Related papers (2020-06-13T20:00:18Z)