DiffAudit: Auditing Privacy Practices of Online Services for Children and Adolescents
- URL: http://arxiv.org/abs/2406.06473v1
- Date: Mon, 10 Jun 2024 17:14:53 GMT
- Title: DiffAudit: Auditing Privacy Practices of Online Services for Children and Adolescents
- Authors: Olivia Figueira, Rahmadi Trimananda, Athina Markopoulou, Scott Jordan
- Abstract summary: Children's and adolescents' online data privacy is regulated by laws such as the Children's Online Privacy Protection Act (COPPA).
Online services directed towards children, adolescents, and adults must comply with these laws.
We present DiffAudit, a platform-agnostic privacy auditing methodology for general audience services.
- Score: 5.609870736739224
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Children's and adolescents' online data privacy is regulated by laws such as the Children's Online Privacy Protection Act (COPPA) and the California Consumer Privacy Act (CCPA). Online services that are directed towards general audiences (i.e., including children, adolescents, and adults) must comply with these laws. In this paper, first, we present DiffAudit, a platform-agnostic privacy auditing methodology for general audience services. DiffAudit performs differential analysis of network traffic data flows to compare data processing practices (i) between child, adolescent, and adult users and (ii) before and after consent is given and user age is disclosed. We also present a data type classification method that utilizes GPT-4 and our data type ontology based on COPPA and CCPA, allowing us to identify considerably more data types than prior work. Second, we apply DiffAudit to a set of popular general audience mobile and web services and observe a rich set of behaviors extracted from over 440K outgoing requests, containing 3,968 unique data types that we extracted and classified. We reveal problematic data processing practices prior to consent and age disclosure, lack of differentiation between age-specific data flows, inconsistent privacy policy disclosures, and sharing of linkable data with third parties, including advertising and tracking services.
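The differential-analysis step described in the abstract can be illustrated with a small, hedged sketch: each outgoing request is reduced to a (destination, data type) flow recorded per persona and consent state, and the resulting sets are diffed. The records, domains, and data types below are hypothetical, and this is not the authors' implementation.

```python
# Minimal, illustrative sketch of the differential-analysis idea (not DiffAudit itself).
# Each outgoing request is reduced to a (destination, data type) "flow", recorded per
# persona and consent state; all records below are hypothetical.
from collections import defaultdict

flows = [
    # (persona, consent_state, destination, data_type)
    ("child",      "pre-consent",  "ads.example.com",     "advertising ID"),
    ("child",      "post-consent", "api.example.com",     "user ID"),
    ("adolescent", "post-consent", "ads.example.com",     "advertising ID"),
    ("adult",      "post-consent", "ads.example.com",     "advertising ID"),
    ("adult",      "post-consent", "tracker.example.net", "precise location"),
]

def group_flows(records):
    """Group (destination, data type) pairs by (persona, consent_state)."""
    grouped = defaultdict(set)
    for persona, consent, dest, dtype in records:
        grouped[(persona, consent)].add((dest, dtype))
    return grouped

grouped = group_flows(flows)

# (i) Differential comparison between age groups after consent: identical flows for
# children and adults may indicate a lack of age-specific differentiation.
child = grouped[("child", "post-consent")]
adult = grouped[("adult", "post-consent")]
print("Shared child/adult flows:", child & adult)
print("Adult-only flows:        ", adult - child)

# (ii) Flows observed before consent or age disclosure are candidate
# COPPA/CCPA compliance problems.
pre_consent = {f for (persona, consent), s in grouped.items()
               if consent == "pre-consent" for f in s}
print("Pre-consent flows:       ", pre_consent)
```

In DiffAudit itself, the data types attached to each request come from classifying traffic payloads with GPT-4 against the COPPA/CCPA-based ontology; the toy tuples above simply stand in for that step.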
Related papers
- Differentially Private Data Release on Graphs: Inefficiencies and Unfairness [48.96399034594329]
This paper characterizes the impact of Differential Privacy on bias and unfairness in the context of releasing information about networks.
We consider a network release problem where the network structure is known to all, but the weights on edges must be released privately.
Our work provides theoretical foundations for, and empirical evidence of, the bias and unfairness arising from privacy in these networked decision problems.
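Private release of edge weights, as summarized above, is commonly achieved with the Laplace mechanism. The sketch below is a generic textbook construction under an assumed sensitivity and privacy budget, not necessarily the mechanism analyzed in this paper.

```python
# Illustrative Laplace-mechanism sketch for releasing edge weights while the graph
# structure stays public. Generic construction, not the paper's algorithm;
# sensitivity and epsilon are assumed values.
import random

def laplace_noise(scale: float) -> float:
    """Laplace(0, scale) noise, drawn as the difference of two exponential samples."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def release_weights(weights, epsilon=1.0, sensitivity=1.0):
    """Add independent Laplace(sensitivity / epsilon) noise to every edge weight."""
    scale = sensitivity / epsilon
    return {edge: w + laplace_noise(scale) for edge, w in weights.items()}

# The edge set is public; only the weights are perturbed before release.
true_weights = {("a", "b"): 3.0, ("b", "c"): 7.0, ("a", "c"): 1.5}
print(release_weights(true_weights, epsilon=0.5))
```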
arXiv Detail & Related papers (2024-08-08T08:37:37Z)
- Collection, usage and privacy of mobility data in the enterprise and public administrations [55.2480439325792]
Security measures such as anonymization are needed to protect individuals' privacy.
Within our study, we conducted expert interviews to gain insights into practices in the field.
We survey privacy-enhancing methods in use, which generally do not comply with state-of-the-art standards of differential privacy.
arXiv Detail & Related papers (2024-07-04T08:29:27Z)
- IDPFilter: Mitigating Interdependent Privacy Issues in Third-Party Apps [0.30693357740321775]
Third-party apps have raised concerns about interdependent privacy (IDP).
This paper provides a comprehensive investigation into the previously underinvestigated IDP issues of third-party apps.
We propose IDPFilter, a platform-agnostic API that enables application providers to minimize collateral information collection.
arXiv Detail & Related papers (2024-05-02T16:02:13Z)
- Using Decentralized Aggregation for Federated Learning with Differential Privacy [0.32985979395737774]
Federated Learning (FL) provides some level of privacy by retaining the data at the local node.
This research deploys an experimental environment for FL with Differential Privacy (DP) using benchmark datasets.
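As a rough sketch of the idea summarized above, clients keep raw data local and share only model updates, which are clipped and noised before aggregation. The toy updates, clipping bound, and noise scale are assumptions, not the paper's experimental configuration.

```python
# Hedged sketch of federated aggregation with differential privacy: raw data never
# leaves the client; updates are clipped and Gaussian noise is added to the average.
# All constants below are illustrative assumptions.
import random

def clip(update, max_norm=1.0):
    """Clip an update vector to bound each client's contribution (its sensitivity)."""
    norm = sum(x * x for x in update) ** 0.5
    scale = min(1.0, max_norm / norm) if norm > 0 else 1.0
    return [x * scale for x in update]

def dp_aggregate(client_updates, max_norm=1.0, noise_std=0.5):
    """Average clipped client updates and add Gaussian noise to the aggregate."""
    clipped = [clip(u, max_norm) for u in client_updates]
    dim, n = len(clipped[0]), len(clipped)
    avg = [sum(u[i] for u in clipped) / n for i in range(dim)]
    return [a + random.gauss(0.0, noise_std / n) for a in avg]

# Each client computes an update from its *local* data; only the update is shared.
updates = [[0.2, -0.1], [0.5, 0.3], [-0.4, 0.1]]
print(dp_aggregate(updates))
```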
arXiv Detail & Related papers (2023-11-27T17:02:56Z)
- When Graph Convolution Meets Double Attention: Online Privacy Disclosure Detection with Multi-Label Text Classification [6.700420953065072]
It is important to detect such unwanted privacy disclosures to help alert the affected users and the online platform.
In this paper, privacy disclosure detection is modeled as a multi-label text classification problem.
A new privacy disclosure detection model is proposed to construct an MLTC classifier for detecting online privacy disclosures.
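A minimal baseline for the multi-label formulation mentioned above could look as follows. The toy posts and labels are hypothetical, and the TF-IDF plus one-vs-rest logistic regression pipeline is a stand-in, not the graph-convolution and double-attention model the paper proposes.

```python
# Hedged sketch: privacy-disclosure detection framed as multi-label text classification.
# Toy data and a simple scikit-learn pipeline; not the model proposed in the paper.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

posts = [
    "Just landed in Paris, staying at the hotel near the station!",
    "My doctor finally confirmed the diagnosis today.",
    "Meet me at 5th and Main, my number is in my bio.",
    "What a great movie night.",
]
labels = [{"location"}, {"health"}, {"location", "contact"}, set()]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)            # multi-label indicator matrix
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(posts)      # bag-of-words features

clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)

new_post = vectorizer.transform(["I'm at the airport, call me at this number"])
print(mlb.inverse_transform(clf.predict(new_post)))
```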
arXiv Detail & Related papers (2023-11-27T15:25:17Z)
- A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z)
- Protecting User Privacy in Online Settings via Supervised Learning [69.38374877559423]
We design an intelligent approach to online privacy protection that leverages supervised learning.
By detecting and blocking data collection that might infringe on a user's privacy, we can restore a degree of digital privacy to the user.
arXiv Detail & Related papers (2023-04-06T05:20:16Z)
- Having your Privacy Cake and Eating it Too: Platform-supported Auditing of Social Media Algorithms for Public Interest [70.02478301291264]
Social media platforms curate access to information and opportunities, and so play a critical role in shaping public discourse.
Prior studies have used black-box methods to show that these curation algorithms can lead to biased or discriminatory outcomes.
We propose a new method for platform-supported auditing that can meet the goals of the proposed legislation.
arXiv Detail & Related papers (2022-07-18T17:32:35Z)
- Towards a Data Privacy-Predictive Performance Trade-off [2.580765958706854]
We evaluate the existence of a trade-off between data privacy and predictive performance in classification tasks.
In contrast to previous literature, we find that the higher the level of privacy, the greater the impact on predictive performance.
arXiv Detail & Related papers (2022-01-13T21:48:51Z)
- Location Data and COVID-19 Contact Tracing: How Data Privacy Regulations and Cell Service Providers Work In Tandem [1.1470070927586016]
Cellphone Service Providers (CSPs) enable real-time spatial and temporal user tracking.
Three of the regulations analyzed defined mobile location data as private information.
Two of the five CSPs that were analyzed did not comply with the COPPA regulation.
arXiv Detail & Related papers (2021-03-25T22:13:50Z)
- Second layer data governance for permissioned blockchains: the privacy management challenge [58.720142291102135]
In pandemic situations, such as the COVID-19 and Ebola outbreaks, sharing health data is crucial to contain infection and reduce the number of deaths.
In this context, permissioned blockchain technology emerges to empower users to exercise their rights, providing data ownership, transparency, and security through an immutable, unified, and distributed database governed by smart contracts.
arXiv Detail & Related papers (2020-10-22T13:19:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.