Position: Challenges and Opportunities for Differential Privacy in the U.S. Federal Government
- URL: http://arxiv.org/abs/2410.16423v1
- Date: Mon, 21 Oct 2024 18:46:05 GMT
- Title: Position: Challenges and Opportunities for Differential Privacy in the U.S. Federal Government
- Authors: Amol Khanna, Adam McCormick, Andre Nguyen, Chris Aguirre, Edward Raff,
- Abstract summary: We seek to elucidate challenges and opportunities for differential privacy within the federal government setting.
We highlight three significant challenges which currently restrict the use of differential privacy in the U.S. government.
We provide two examples where differential privacy can enhance the capabilities of government agencies.
- Score: 34.255047514441195
- Abstract: In this article, we seek to elucidate challenges and opportunities for differential privacy within the federal government setting, as seen by a team of differential privacy researchers, privacy lawyers, and data scientists working closely with the U.S. government. After introducing differential privacy, we highlight three significant challenges which currently restrict the use of differential privacy in the U.S. government. We then provide two examples where differential privacy can enhance the capabilities of government agencies. The first example highlights how the quantitative nature of differential privacy allows policy security officers to release multiple versions of analyses with different levels of privacy. The second example, which we believe is a novel realization, indicates that differential privacy can be used to improve staffing efficiency in classified applications. We hope that this article can serve as a nontechnical resource which can help frame future action from the differential privacy community, privacy regulators, security officers, and lawmakers.
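The first example rests on the fact that the privacy loss in differential privacy is an explicit, tunable parameter (epsilon), so the same analysis can be released at several privacy levels. The minimal sketch below is not taken from the paper; the query, epsilon values, and function names are illustrative assumptions. It shows one way such multi-level releases could look, using the standard Laplace mechanism on a single count query:

```python
import numpy as np


def laplace_release(true_value, epsilon, sensitivity=1.0, rng=None):
    """Release a numeric query result with epsilon-differential privacy.

    The Laplace mechanism adds noise drawn from Laplace(0, sensitivity/epsilon);
    for a counting query the sensitivity is 1. Smaller epsilon means more noise
    and a stronger privacy guarantee.
    """
    rng = np.random.default_rng() if rng is None else rng
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)


# Hypothetical count of records matching some query.
true_count = 1234

# The same analysis released at several privacy levels, e.g. a strict public
# release (epsilon = 0.1) versus a looser internal release (epsilon = 2.0).
for eps in (0.1, 0.5, 2.0):
    print(f"epsilon={eps}: noisy count = {laplace_release(true_count, eps):.1f}")
```

Running this a few times makes concrete how smaller epsilon values produce noisier but more strongly protected releases, which is the trade-off a security officer would tune when preparing multiple versions of an analysis.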
Related papers
- Differential Privacy Overview and Fundamental Techniques [63.0409690498569]
This chapter is meant to be part of the book "Differential Privacy in Artificial Intelligence: From Theory to Practice"
It starts by illustrating various attempts to protect data privacy, emphasizing where and why they failed.
It then defines the key actors, tasks, and scopes that make up the domain of privacy-preserving data analysis.
arXiv Detail & Related papers (2024-11-07T13:52:11Z)
- Masked Differential Privacy [64.32494202656801]
We propose an effective approach called masked differential privacy (DP), which allows for controlling sensitive regions where differential privacy is applied.
Our method operates selectively on the data and allows for defining non-sensitive spatio-temporal regions without DP application, or for combining differential privacy with other privacy techniques within data samples.
arXiv Detail & Related papers (2024-10-22T15:22:53Z)
- Formalization of Differential Privacy in Isabelle/HOL [0.16574413179773761]
We propose an Isabelle/HOL library for formalizing differential privacy in a general setting.
To our knowledge, it is the first formalization of differential privacy that supports continuous probability distributions.
arXiv Detail & Related papers (2024-10-20T13:06:13Z)
- Models Matter: Setting Accurate Privacy Expectations for Local and Central Differential Privacy [14.40391109414476]
We design and evaluate new explanations of differential privacy for the local and central models.
We find that consequences-focused explanations in the style of privacy nutrition labels are a promising approach for setting accurate privacy expectations.
arXiv Detail & Related papers (2024-08-16T01:21:57Z)
- Algorithms with More Granular Differential Privacy Guarantees [65.3684804101664]
We consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis.
In this work, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person.
arXiv Detail & Related papers (2022-09-08T22:43:50Z)
- Equity and Privacy: More Than Just a Tradeoff [10.545898004301323]
Recent work has shown that privacy preserving data publishing can introduce different levels of utility across different population groups.
Will marginal populations see disproportionately less utility from privacy technology?
If there is an inequity, how can we address it?
arXiv Detail & Related papers (2021-11-08T17:39:32Z)
- "I need a better description": An Investigation Into User Expectations For Differential Privacy [31.352325485393074]
We explore users' privacy expectations related to differential privacy.
We find that users care about the kinds of information leaks against which differential privacy protects.
We find that the ways in which differential privacy is described in-the-wild haphazardly set users' privacy expectations.
arXiv Detail & Related papers (2021-10-13T02:36:37Z)
- Applications of Differential Privacy in Social Network Analysis: A Survey [60.696428840516724]
Differential privacy is effective in sharing information and preserving privacy with a strong guarantee.
Social network analysis has been extensively adopted in many applications, opening a new arena for the application of differential privacy.
arXiv Detail & Related papers (2020-10-06T19:06:03Z)
- More Than Privacy: Applying Differential Privacy in Key Areas of Artificial Intelligence [62.3133247463974]
We show that differential privacy can do more than just privacy preservation in AI.
It can also be used to improve security, stabilize learning, build fair models, and impose composition in selected areas of AI.
arXiv Detail & Related papers (2020-08-05T03:07:36Z)
- Auditing Differentially Private Machine Learning: How Private is Private SGD? [16.812900569416062]
We investigate whether Differentially Private SGD offers better privacy in practice than what is guaranteed by its state-of-the-art analysis.
We do so via novel data poisoning attacks, which we show correspond to realistic privacy attacks.
arXiv Detail & Related papers (2020-06-13T20:00:18Z)