Enabling Humanitarian Applications with Targeted Differential Privacy
- URL: http://arxiv.org/abs/2408.13424v1
- Date: Sat, 24 Aug 2024 01:34:37 GMT
- Title: Enabling Humanitarian Applications with Targeted Differential Privacy
- Authors: Nitin Kohli, Joshua Blumenstock
- Abstract summary: This paper develops an approach to implementing algorithmic decisions based on personal data.
It provides formal privacy guarantees to data subjects.
We show that stronger privacy guarantees typically come at some cost.
- Score: 0.39462888523270856
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The proliferation of mobile phones in low- and middle-income countries has suddenly and dramatically increased the extent to which the world's poorest and most vulnerable populations can be observed and tracked by governments and corporations. Millions of historically "off the grid" individuals are now passively generating digital data; these data, in turn, are being used to make life-altering decisions about those individuals -- including whether or not they receive government benefits, and whether they qualify for a consumer loan. This paper develops an approach to implementing algorithmic decisions based on personal data, while also providing formal privacy guarantees to data subjects. The approach adapts differential privacy to applications that require decisions about individuals, and gives decision makers granular control over the level of privacy guaranteed to data subjects. We show that stronger privacy guarantees typically come at some cost, and use data from two real-world applications -- an anti-poverty program in Togo and a consumer lending platform in Nigeria -- to illustrate those costs. Our empirical results quantify the tradeoff between privacy and predictive accuracy, and characterize how different privacy guarantees impact overall program effectiveness. More broadly, our results demonstrate a way for humanitarian programs to responsibly use personal data, and better equip program designers to make informed decisions about data privacy.
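To make the privacy-accuracy tradeoff concrete, below is a minimal sketch (in Python with numpy) of one way a targeted decision could be privatized: Laplace noise calibrated to a per-person privacy budget is added to a predicted score before an eligibility threshold is applied. This illustrates the general idea only, not the authors' mechanism; the names `score`, `cutoff`, and `epsilon`, and the sensitivity of 1.0, are assumptions.

```python
# Illustrative sketch only: a noisy-threshold eligibility decision where each
# individual's guarantee is controlled by a per-person epsilon.
import numpy as np

rng = np.random.default_rng(0)

def noisy_decision(score: float, epsilon: float, cutoff: float,
                   sensitivity: float = 1.0) -> bool:
    """Eligibility decision under an epsilon-DP Laplace perturbation.

    A smaller epsilon means a stronger privacy guarantee for this
    individual, at the cost of a noisier (less accurate) decision.
    """
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return (score + noise) < cutoff  # below-cutoff scores qualify for aid

# Stronger guarantees (smaller epsilon) flip more decisions:
scores = rng.uniform(0.0, 10.0, size=1000)
exact = scores < 5.0
for eps in (0.1, 1.0, 10.0):
    noisy = np.array([noisy_decision(s, eps, cutoff=5.0) for s in scores])
    print(f"epsilon={eps:>4}: {np.mean(noisy == exact):.1%} of decisions unchanged")
```

Running the loop makes the paper's central tradeoff visible: as epsilon shrinks, more decisions diverge from the noise-free ones, which is the cost the empirical results quantify.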
Related papers
- Masked Differential Privacy [64.32494202656801]
We propose an effective approach called masked differential privacy (DP), which allows for controlling sensitive regions where differential privacy is applied.
Our method operates selectively on the data and allows for defining non-sensitive spatio-temporal regions without DP application, or for combining differential privacy with other privacy techniques within data samples.
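As an illustration of the selective idea, the sketch below adds Laplace noise only where a boolean sensitivity mask is set; the mask, noise scale, and `masked_laplace` helper are hypothetical stand-ins, not the paper's actual mechanism.

```python
# Illustrative sketch: noise is applied only to entries flagged as sensitive,
# leaving non-sensitive regions untouched.
import numpy as np

rng = np.random.default_rng(0)

def masked_laplace(data: np.ndarray, sensitive: np.ndarray,
                   epsilon: float, sensitivity: float = 1.0) -> np.ndarray:
    """Perturb only the entries flagged as sensitive."""
    noise = rng.laplace(scale=sensitivity / epsilon, size=data.shape)
    return np.where(sensitive, data + noise, data)

frame = rng.uniform(0, 255, size=(4, 4))   # e.g., a tiny image patch
mask = np.zeros_like(frame, dtype=bool)
mask[1:3, 1:3] = True                      # mark a sensitive region
print(masked_laplace(frame, mask, epsilon=0.5))
```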
arXiv Detail & Related papers (2024-10-22T15:22:53Z)
- Evaluating the Effects of Digital Privacy Regulations on User Trust [0.0]
The study investigates the impact of digital privacy laws on user trust by comparing regulations in the Netherlands, Ghana, and Malaysia.
The main findings reveal that while the General Data Protection Regulation in the Netherlands is strict, its practical impact is limited by enforcement challenges.
In Ghana, the Data Protection Act is underutilized due to low public awareness and insufficient enforcement, leading to reliance on personal protective measures.
In Malaysia, trust in digital services is largely dependent on the security practices of individual platforms rather than the Personal Data Protection Act.
arXiv Detail & Related papers (2024-09-04T11:11:41Z)
- Collection, usage and privacy of mobility data in the enterprise and public administrations [55.2480439325792]
Security measures such as anonymization are needed to protect individuals' privacy.
Within our study, we conducted expert interviews to gain insights into practices in the field.
We survey privacy-enhancing methods in use, which generally do not comply with state-of-the-art standards of differential privacy.
arXiv Detail & Related papers (2024-07-04T08:29:27Z)
- Lomas: A Platform for Confidential Analysis of Private Data [0.0]
Lomas is a novel open-source platform designed to realize the full potential of the data held by public administrations.
It enables authorized users, such as approved researchers and government analysts, to execute algorithms on confidential datasets.
Lomas executes these algorithms without revealing the data to the user and returns the results protected by Differential Privacy.
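The query flow such a platform implies might look like the following sketch: the server evaluates a counting query on data it never releases and returns a Laplace-noised answer while debiting a privacy budget. The `PrivateQueryServer` class and its methods are hypothetical, not the Lomas API.

```python
# Hedged sketch of a DP query server: raw records never leave the server;
# only noised aggregates are returned, and a total budget bounds the leakage.
import numpy as np

class PrivateQueryServer:
    def __init__(self, records: np.ndarray, total_budget: float):
        self._records = records          # never returned to the caller
        self._budget = total_budget
        self._rng = np.random.default_rng(0)

    def count(self, predicate, epsilon: float) -> float:
        """Answer a counting query under epsilon-DP, debiting the budget."""
        if epsilon > self._budget:
            raise ValueError("privacy budget exhausted")
        self._budget -= epsilon
        true_count = np.sum(predicate(self._records))
        return true_count + self._rng.laplace(scale=1.0 / epsilon)

ages = np.random.default_rng(1).integers(18, 90, 10_000)
server = PrivateQueryServer(ages, total_budget=1.0)
print(server.count(lambda a: a < 30, epsilon=0.5))
```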
arXiv Detail & Related papers (2024-06-24T19:16:58Z)
- Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning [62.224804688233]
Differential privacy (DP) offers a promising solution by ensuring models are 'almost indistinguishable' with or without any particular privacy unit.
We study user-level DP, motivated by applications where it is necessary to ensure uniform privacy protection across users.
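A common recipe for user-level (rather than record-level) guarantees is to bound each user's total contribution and add noise scaled to that bound. The sketch below illustrates this with per-user clipping and Gaussian noise; the parameter names and noise calibration are illustrative assumptions, not the paper's algorithm.

```python
# Hedged sketch: aggregate one vector per user with user-level clipping and
# Gaussian noise, so no single user can move the result by more than clip_norm.
import numpy as np

rng = np.random.default_rng(0)

def user_level_mean(per_user_stats: list[np.ndarray],
                    clip_norm: float, noise_multiplier: float) -> np.ndarray:
    clipped = []
    for v in per_user_stats:
        scale = min(1.0, clip_norm / (np.linalg.norm(v) + 1e-12))
        clipped.append(v * scale)       # each user contributes <= clip_norm
    total = np.sum(clipped, axis=0)
    total += rng.normal(scale=noise_multiplier * clip_norm, size=total.shape)
    return total / len(per_user_stats)

users = [rng.normal(size=8) * rng.uniform(0.5, 5.0) for _ in range(100)]
print(user_level_mean(users, clip_norm=1.0, noise_multiplier=1.1))
```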
arXiv Detail & Related papers (2024-06-20T13:54:32Z)
- Assessing Mobile Application Privacy: A Quantitative Framework for Privacy Measurement [0.0]
The purpose of this framework is to systematically evaluate the level of privacy risk when using particular Android applications.
This work aims to contribute to a digital environment that prioritizes privacy, promotes informed decision-making, and endorses privacy-preserving design principles.
arXiv Detail & Related papers (2023-10-31T18:12:19Z)
- Big Data Privacy in Emerging Market Fintech and Financial Services: A Research Agenda [0.9310318514564271]
This white paper describes a research agenda to advance our understanding of the problems and solutions of data privacy in emerging market fintech and financial services.
We highlight five priority areas for research, including comprehensive analyses; understanding local definitions of 'data privacy'; documenting key sources of risk; and potential technical solutions.
We hope this research agenda will focus attention on the multi-faceted nature of privacy in emerging markets.
arXiv Detail & Related papers (2023-10-08T02:11:19Z)
- Protecting User Privacy in Online Settings via Supervised Learning [69.38374877559423]
We design an intelligent approach to online privacy protection that leverages supervised learning.
By detecting and blocking data collection that might infringe on a user's privacy, we can restore a degree of digital privacy to the user.
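As a toy illustration of the supervised-learning idea, the sketch below trains a classifier on synthetic request features and blocks requests it flags as likely trackers. The features, labels, and `should_block` helper are invented stand-ins, not the paper's dataset, feature set, or model.

```python
# Toy sketch: classify requests as privacy-infringing or not, then block
# the flagged ones. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic per-request features: [num query params, payload size,
# third-party-domain flag, known-tracker-keyword flag]
X = rng.uniform(0, 1, size=(500, 4))
y = (0.8 * X[:, 2] + 0.9 * X[:, 3] + rng.normal(0, 0.1, 500)) > 0.8

clf = LogisticRegression().fit(X, y)

def should_block(request_features: np.ndarray) -> bool:
    """Block the request if the classifier flags it as a likely tracker."""
    return bool(clf.predict(request_features.reshape(1, -1))[0])

print(should_block(np.array([0.1, 0.2, 1.0, 1.0])))  # likely blocked
```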
arXiv Detail & Related papers (2023-04-06T05:20:16Z)
- Position: Considerations for Differentially Private Learning with Large-Scale Public Pretraining [75.25943383604266]
We question whether the use of large Web-scraped datasets should be viewed as differential-privacy-preserving.
We caution that publicizing these models pretrained on Web data as "private" could lead to harm and erode the public's trust in differential privacy as a meaningful definition of privacy.
We conclude by discussing potential paths forward for the field of private learning, as public pretraining becomes more popular and powerful.
arXiv Detail & Related papers (2022-12-13T10:41:12Z)
- PCAL: A Privacy-preserving Intelligent Credit Risk Modeling Framework Based on Adversarial Learning [111.19576084222345]
This paper proposes PCAL, a framework for Privacy-preserving Credit risk modeling based on Adversarial Learning.
PCAL aims to mask the private information inside the original dataset, while maintaining the important utility information for the target prediction task performance.
Results indicate that PCAL can learn an effective, privacy-free representation from user data, providing a solid foundation towards privacy-preserving machine learning for credit risk analysis.
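The adversarial pattern PCAL describes can be sketched as a min-max game: an encoder and task head are trained for utility while an adversary trying to recover the private attribute is pushed toward chance. The architecture sizes, loss weighting `lam`, and training schedule below are assumptions for illustration, not the paper's configuration.

```python
# Hedged sketch of adversarial representation learning: alternate between
# training the adversary and training the encoder/task head against it.
import torch
import torch.nn as nn

torch.manual_seed(0)
encoder = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 8))
task_head = nn.Linear(8, 1)   # predicts the target (e.g., credit risk)
adversary = nn.Linear(8, 1)   # tries to recover the private attribute

opt_main = torch.optim.Adam(
    list(encoder.parameters()) + list(task_head.parameters()), lr=1e-3)
opt_adv = torch.optim.Adam(adversary.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

x = torch.randn(256, 16)                        # synthetic user features
y_task = torch.randint(0, 2, (256, 1)).float()  # utility label
y_priv = torch.randint(0, 2, (256, 1)).float()  # private attribute

lam = 1.0  # assumed utility/privacy tradeoff weight
for step in range(200):
    z = encoder(x)
    # (1) train the adversary to recover the private attribute from z
    adv_loss = bce(adversary(z.detach()), y_priv)
    opt_adv.zero_grad()
    adv_loss.backward()
    opt_adv.step()
    # (2) train encoder + task head: keep utility, defeat the adversary
    main_loss = bce(task_head(z), y_task) - lam * bce(adversary(z), y_priv)
    opt_main.zero_grad()
    main_loss.backward()
    opt_main.step()
```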
arXiv Detail & Related papers (2020-10-06T07:04:59Z)
- Utility-aware Privacy-preserving Data Releasing [7.462336024223669]
We propose a two-step perturbation-based privacy-preserving data releasing framework.
First, certain predefined privacy and utility problems are learned from the public domain data.
We then leverage the learned knowledge to precisely perturb the data owners' data into privatized data.
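A hedged sketch of the two-step idea follows: first estimate from public data which features carry utility for the target task, then perturb the owners' data, noising low-utility features hardest. The correlation-based utility proxy and the noise scales are illustrative choices, not the paper's method.

```python
# Hedged two-step sketch: learn a utility profile from public data, then
# apply utility-aware perturbation to the private data before release.
import numpy as np

rng = np.random.default_rng(0)

# Step 1: estimate per-feature utility from public data
# (here, |correlation| with the public label serves as a crude proxy).
X_pub = rng.normal(size=(1000, 5))
y_pub = 2.0 * X_pub[:, 0] + X_pub[:, 1] + rng.normal(size=1000)
utility = np.abs([np.corrcoef(X_pub[:, j], y_pub)[0, 1] for j in range(5)])

# Step 2: perturb the owners' records, adding less noise where utility is high.
def privatize(X: np.ndarray, utility: np.ndarray,
              base_scale: float = 1.0) -> np.ndarray:
    scales = base_scale * (1.0 - utility / utility.max())
    return X + rng.normal(scale=scales, size=X.shape)

X_private = rng.normal(size=(200, 5))
print(privatize(X_private, utility)[:2])
```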
arXiv Detail & Related papers (2020-05-09T05:32:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.