RADS-Checker: Measuring Compliance with Right of Access by the Data Subject in Android Markets
- URL: http://arxiv.org/abs/2410.12463v1
- Date: Wed, 16 Oct 2024 11:23:26 GMT
- Title: RADS-Checker: Measuring Compliance with Right of Access by the Data Subject in Android Markets
- Authors: Zhenhua Li, Zhanpeng Liang, Congcong Yao, Jingyu Hua, Sheng Zhong
- Abstract summary: The latest data protection regulations worldwide, such as the General Data Protection Regulation (GDPR), have established the Right of Access by the Data Subject (RADS).
RADS grants users the right to obtain a copy of their personal data from personal data controllers.
There is currently no research systematically examining whether RADS has been effectively implemented in mobile apps.
- Score: 5.598268459947247
- Abstract: The latest data protection regulations worldwide, such as the General Data Protection Regulation (GDPR), have established the Right of Access by the Data Subject (RADS), granting users the right to access and obtain a copy of their personal data from the data controllers. This clause can effectively compel data controllers to handle user personal data more cautiously, which is of significant importance for protecting user privacy. However, there is currently no research systematically examining whether RADS has been effectively implemented in mobile apps, which are the most common personal data controllers. In this study, we propose a compliance measurement framework for RADS in apps. In our framework, we first analyze an app's privacy policy text using NLP techniques such as GPT-4 to verify whether it clearly declares offering RADS to users and provides specific details on how the right can be exercised. Next, we assess the authenticity and usability of the identified implementation methods by submitting data access requests to the app. Finally, for the obtained data copies, we further verify their completeness by comparing them with the user personal data actually collected by the app during runtime, as captured via Frida hooks. We analyzed a total of 1,631 apps in the American app market G and the Chinese app market H. The results show that less than 54.50% and 37.05% of apps in G and H, respectively, explicitly state in their privacy policies that they can provide users with copies of their personal data. Additionally, in both app markets, less than 20% of apps could truly provide users with their data copies. Finally, among the obtained data copies, only about 2.94% from G pass the completeness verification.
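The first and third steps of the framework are concrete enough to sketch. Below is a minimal, hypothetical Python example of the privacy-policy check: an LLM is asked whether the policy declares RADS and names the channels for exercising it. The prompt wording, model name, and helper function are assumptions for illustration, not the authors' exact pipeline.

```python
# Hypothetical sketch of the privacy-policy screening step.
# Prompt wording and model name are assumptions, not the paper's pipeline.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def declares_rads(policy_text: str) -> str:
    """Ask the model whether the policy offers a copy of personal data."""
    response = client.chat.completions.create(
        model="gpt-4",
        temperature=0,
        messages=[{
            "role": "user",
            "content": (
                "Does the following privacy policy state that users can "
                "obtain a copy of their personal data? If yes, list the "
                "channels for submitting the request (e.g., email, "
                "in-app form). Answer 'yes' or 'no' first.\n\n" + policy_text
            ),
        }],
    )
    return response.choices[0].message.content
```

The completeness check can be sketched in the same spirit: hook a representative personal-data API with Frida while exercising the app, then test whether every value observed at runtime appears in the controller-provided data copy. The package name, hooked API, and file name below are illustrative; the paper's tooling would hook many more data sources.

```python
# Hypothetical sketch of the runtime-capture and completeness check.
# Package name, hooked API, and file name are illustrative only.
import json
import frida

HOOK_SCRIPT = """
Java.perform(function () {
    // Hook one representative personal-data source: the Android device ID.
    var Secure = Java.use('android.provider.Settings$Secure');
    Secure.getString.overload(
        'android.content.ContentResolver', 'java.lang.String'
    ).implementation = function (resolver, name) {
        var value = this.getString(resolver, name);
        if (name === 'android_id') {
            send({field: 'android_id', value: value});
        }
        return value;
    };
});
"""

collected = {}  # field name -> value observed at runtime

def on_message(message, data):
    if message['type'] == 'send':
        payload = message['payload']
        collected[payload['field']] = payload['value']

device = frida.get_usb_device()
session = device.attach('com.example.app')      # hypothetical package name
script = session.create_script(HOOK_SCRIPT)
script.on('message', on_message)
script.load()
input('Exercise the app, then press Enter... ')

# Completeness: every runtime-observed value should appear in the data copy.
with open('data_copy.json') as fp:              # hypothetical export file
    copy_text = json.dumps(json.load(fp))
missing = [k for k, v in collected.items() if v and v not in copy_text]
print('Fields missing from data copy:', missing or 'none')
```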
Related papers
- Private, Augmentation-Robust and Task-Agnostic Data Valuation Approach for Data Marketplace [56.78396861508909]
PriArTa is an approach for computing the distance between the distribution of the buyer's existing dataset and the seller's dataset.
PriArTa is communication-efficient, enabling the buyer to evaluate datasets without needing access to the entire dataset from each seller.
arXiv Detail & Related papers (2024-11-01T17:13:14Z) - Are LLM-based methods good enough for detecting unfair terms of service? [67.49487557224415]
Large language models (LLMs) are good at parsing long text-based documents.
We build a dataset consisting of 12 questions applied individually to a set of privacy policies.
Some open-source models achieve higher accuracy than some commercial models.
arXiv Detail & Related papers (2024-08-24T09:26:59Z) - How to Drill Into Silos: Creating a Free-to-Use Dataset of Data Subject Access Packages [0.0]
The European Union's General Data Protection Regulation strengthened data subjects' right to access their personal data.
So far, data subjects' possibilities for actually using controller-provided subject access request packages (SARPs) have been severely limited.
This dataset is publicly provided and shall, in the future, serve as a starting point for researching and comparing novel approaches for practically viable use of SARPs.
arXiv Detail & Related papers (2024-07-05T12:39:51Z) - PolicyGPT: Automated Analysis of Privacy Policies with Large Language Models [41.969546784168905]
In practical use, users tend to click the Agree button directly rather than reading privacy policies carefully.
This practice exposes users to risks of privacy leakage and legal issues.
Recently, the advent of Large Language Models (LLM) such as ChatGPT and GPT-4 has opened new possibilities for text analysis.
arXiv Detail & Related papers (2023-09-19T01:22:42Z) - Lessons in VCR Repair: Compliance of Android App Developers with the California Consumer Privacy Act (CCPA) [4.429726534947266]
The California Consumer Privacy Act (CCPA) provides California residents with a range of enhanced privacy protections and rights.
Our research investigated the extent to which Android app developers comply with the provisions of the CCPA.
We compare the actual network traffic of 109 apps that we believe must comply with the CCPA to the data that apps state they collect in their privacy policies.
arXiv Detail & Related papers (2023-04-03T13:02:49Z) - Certified Data Removal in Sum-Product Networks [78.27542864367821]
Deleting the collected data is often insufficient to guarantee data privacy.
UnlearnSPN is an algorithm that removes the influence of single data points from a trained sum-product network.
arXiv Detail & Related papers (2022-10-04T08:22:37Z) - No Free Lunch in "Privacy for Free: How does Dataset Condensation Help Privacy" [75.98836424725437]
New methods designed to preserve data privacy require careful scrutiny.
Failure to preserve privacy is hard to detect, and yet it can lead to catastrophic results when a system implementing a "privacy-preserving" method is attacked.
arXiv Detail & Related papers (2022-09-29T17:50:23Z) - Tracking in apps' privacy policies [3.8073142980733]
We analysed privacy policies from 26,910 mobile apps in May 2019.
We contacted 52 developers of apps that did not provide a privacy policy and asked them about their data practices.
Despite being legally required to answer such queries, 12 developers (23%) failed to respond.
arXiv Detail & Related papers (2021-11-15T16:03:59Z) - GDPR: When the Right to Access Personal Data Becomes a Threat [63.732639864601914]
We examine more than 300 data controllers, submitting to each of them a request to access personal data.
We find that 50.4% of the data controllers that handled the request have flaws in their procedure for identifying users.
The undesired and surprising result is that, in its present deployment, the right of access has actually decreased the privacy of web service users.
arXiv Detail & Related papers (2020-05-04T22:01:46Z)
This list is automatically generated from the titles and abstracts of the papers on this site.