IDPFilter: Mitigating Interdependent Privacy Issues in Third-Party Apps
- URL: http://arxiv.org/abs/2405.01411v1
- Date: Thu, 2 May 2024 16:02:13 GMT
- Title: IDPFilter: Mitigating Interdependent Privacy Issues in Third-Party Apps
- Authors: Shuaishuai Liu, Gergely Biczók
- Abstract summary: Third-party apps have increased concerns about interdependent privacy (IDP).
This paper provides a comprehensive investigation into the previously underinvestigated IDP issues of third-party apps.
We propose IDPFilter, a platform-agnostic API that enables application providers to minimize collateral information collection.
- Score: 0.30693357740321775
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Third-party applications have become an essential part of today's online ecosystem, enhancing the functionality of popular platforms. However, the intensive data exchange underlying their proliferation has increased concerns about interdependent privacy (IDP). This paper provides a comprehensive investigation into the previously underinvestigated IDP issues of third-party apps. Specifically, first, we analyze the permission structure of multiple app platforms, identifying permissions that have the potential to cause interdependent privacy issues by enabling a user to share someone else's personal data with an app. Second, we collect datasets and characterize the extent to which existing apps request these permissions, revealing the relationship between characteristics such as the respective app platform, the app's type, and the number of interdependent privacy-related permissions it requests. Third, we analyze the various reasons IDP is neglected by both data protection regulations and app platforms and then devise principles that should be followed when designing a mitigation solution. Finally, based on these principles and satisfying clearly defined objectives, we propose IDPFilter, a platform-agnostic API that enables application providers to minimize collateral information collection by filtering out data collected from their users but implicating others as data subjects. We implement a proof-of-concept prototype, IDPTextFilter, that implements the filtering logic on textual data, and provide its initial performance evaluation with regard to privacy, accuracy, and efficiency.
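The abstract describes filtering out "collateral" data, i.e. data a user shares that implicates other people as data subjects. The paper's actual IDPTextFilter logic is not reproduced in this listing, but the core idea for textual data can be sketched as redacting identifiers that belong to third parties before the app receives the text. The function name, parameters, and redaction strategy below are illustrative assumptions, not the paper's implementation:

```python
import re

def filter_collateral_text(text: str,
                           third_party_identifiers: list[str],
                           placeholder: str = "[REDACTED]") -> str:
    """Hypothetical IDP-style filter: remove identifiers (names,
    email addresses, ...) belonging to other data subjects from
    text a user is about to share with a third-party app."""
    for identifier in third_party_identifiers:
        # Case-insensitive whole-token match so substrings of
        # unrelated words are left intact.
        pattern = re.compile(r"\b" + re.escape(identifier) + r"\b",
                             re.IGNORECASE)
        text = pattern.sub(placeholder, text)
    return text

filtered = filter_collateral_text(
    "Lunch with Alice Smith tomorrow; her address is alice@example.com",
    ["Alice Smith", "alice@example.com"],
)
print(filtered)
```

A real filter would also need to detect identifiers it was not told about (e.g. via named-entity recognition) and weigh privacy against the accuracy and efficiency costs the paper evaluates.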
Related papers
- Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning [62.224804688233]
Differential privacy (DP) offers a promising solution by ensuring models are 'almost indistinguishable' with or without any particular privacy unit.
We study user-level DP motivated by applications where it is necessary to ensure uniform privacy protection across users.
arXiv Detail & Related papers (2024-06-20T13:54:32Z) - User Interaction Data in Apps: Comparing Policy Claims to Implementations [0.0]
We analyzed the top 100 apps across diverse categories using static analysis methods to evaluate the alignment between policy claims and implemented data collection techniques.
Our findings highlight the lack of transparency in data collection and the associated risk of re-identification, raising concerns about user privacy and trust.
arXiv Detail & Related papers (2023-12-05T12:11:11Z) - A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z) - TeD-SPAD: Temporal Distinctiveness for Self-supervised Privacy-preservation for video Anomaly Detection [59.04634695294402]
Video anomaly detection (VAD) without human monitoring is a complex computer vision task.
Privacy leakage in VAD allows models to pick up and amplify unnecessary biases related to people's personal information.
We propose TeD-SPAD, a privacy-aware video anomaly detection framework that destroys visual private information in a self-supervised manner.
arXiv Detail & Related papers (2023-08-21T22:42:55Z) - Tapping into Privacy: A Study of User Preferences and Concerns on Trigger-Action Platforms [0.0]
The Internet of Things (IoT) devices are rapidly increasing in popularity, with more individuals using Internet-connected devices that continuously monitor their activities.
This work explores the privacy concerns and expectations of end-users related to Trigger-Action platforms (TAPs) in the context of the Internet of Things (IoT).
TAPs allow users to customize their smart environments by creating rules that trigger actions based on specific events or conditions.
arXiv Detail & Related papers (2023-08-11T14:25:01Z) - How Do Input Attributes Impact the Privacy Loss in Differential Privacy? [55.492422758737575]
We study the connection between the per-subject norm in DP neural networks and individual privacy loss.
We introduce a novel metric termed the Privacy Loss-Input Susceptibility (PLIS) which allows one to apportion the subject's privacy loss to their input attributes.
arXiv Detail & Related papers (2022-11-18T11:39:03Z) - Just Fine-tune Twice: Selective Differential Privacy for Large Language Models [69.66654761324702]
We propose a simple yet effective just-fine-tune-twice privacy mechanism to achieve SDP for large Transformer-based language models.
Experiments show that our models achieve strong performance while staying robust to the canary insertion attack.
arXiv Detail & Related papers (2022-04-15T22:36:55Z) - Reasoning over Public and Private Data in Retrieval-Based Systems [29.515915401413334]
State-of-the-art systems explicitly retrieve information relevant to a user question from a background corpus before producing an answer.
While today's retrieval systems assume the corpus is fully accessible, users are often unable or unwilling to expose their private data to entities hosting public data.
We first define the PUBLIC-PRIVATE AUTOREGRESSIVE Information RETRIEVAL (PAIR) privacy framework for the novel retrieval setting over multiple privacy scopes.
arXiv Detail & Related papers (2022-03-14T13:08:51Z) - Private Reinforcement Learning with PAC and Regret Guarantees [69.4202374491817]
We design privacy-preserving exploration policies for episodic reinforcement learning (RL).
We first provide a meaningful privacy formulation using the notion of joint differential privacy (JDP).
We then develop a private optimism-based learning algorithm that simultaneously achieves strong PAC and regret bounds, and enjoys a JDP guarantee.
arXiv Detail & Related papers (2020-09-18T20:18:35Z) - Privacy-Aware Time-Series Data Sharing with Deep Reinforcement Learning [33.42328078385098]
We study the privacy-utility trade-off (PUT) in time-series data sharing.
Methods that preserve privacy for the current time may leak a significant amount of information at the trace level.
We consider sharing the distorted version of a user's true data sequence with an untrusted third party.
arXiv Detail & Related papers (2020-03-04T18:47:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.