"Money makes the world go around": Identifying Barriers to Better
Privacy in Children's Apps From Developers' Perspectives
- URL: http://arxiv.org/abs/2111.14596v1
- Date: Mon, 29 Nov 2021 15:27:55 GMT
- Title: "Money makes the world go around": Identifying Barriers to Better
Privacy in Children's Apps From Developers' Perspectives
- Authors: Anirudh Ekambaranathan and Jun Zhao and Max Van Kleek
- Abstract summary: The industry for children's apps is thriving at the cost of children's privacy.
These apps routinely disclose children's data to multiple data trackers and ad networks.
We used a mixed-methods approach to investigate why this is happening and how developers might change their practices.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The industry for children's apps is thriving at the cost of children's
privacy: these apps routinely disclose children's data to multiple data
trackers and ad networks. As children spend increasing time online, such
exposure accumulates to long-term privacy risks. In this paper, we used a
mixed-methods approach to investigate why this is happening and how developers
might change their practices. We base our analysis on five leading data
protection frameworks that set out requirements and recommendations for data
collection in children's apps. To understand developers' perspectives and
constraints, we conducted 134 surveys and 20 semi-structured interviews with
popular Android children's app developers. Our analysis revealed that
developers largely respect children's best interests; however, they have to
make compromises due to limited monetisation options, perceived harmlessness of
certain third-party libraries, and lack of availability of design guidelines.
We identified concrete approaches and directions for future research to help
overcome these barriers.
Related papers
- A Large-Scale Privacy Assessment of Android Third-Party SDKs [17.245330733308375]
Third-party Software Development Kits (SDKs) are widely adopted in Android app development.
This convenience raises substantial concerns about unauthorized access to users' privacy-sensitive information.
Our study offers a targeted analysis of user privacy protection among Android third-party SDKs.
arXiv Detail & Related papers (2024-09-16T15:44:43Z)
- PrivacyLens: Evaluating Privacy Norm Awareness of Language Models in Action [54.11479432110771]
PrivacyLens is a novel framework designed to extend privacy-sensitive seeds into expressive vignettes and further into agent trajectories.
We instantiate PrivacyLens with a collection of privacy norms grounded in privacy literature and crowdsourced seeds.
State-of-the-art LMs, like GPT-4 and Llama-3-70B, leak sensitive information in 25.68% and 38.69% of cases, even when prompted with privacy-enhancing instructions.
arXiv Detail & Related papers (2024-08-29T17:58:38Z)
- The Good and The Bad: Exploring Privacy Issues in Retrieval-Augmented Generation (RAG) [56.67603627046346]
Retrieval-augmented generation (RAG) is a powerful technique for augmenting language models with proprietary and private data.
In this work, we conduct empirical studies with novel attack methods, demonstrating that RAG systems are vulnerable to leaking their private retrieval databases.
arXiv Detail & Related papers (2024-02-23T18:35:15Z)
- User Interaction Data in Apps: Comparing Policy Claims to Implementations [0.0]
We analyzed the top 100 apps across diverse categories using static analysis methods to evaluate the alignment between policy claims and implemented data collection techniques.
Our findings highlight the lack of transparency in data collection and the associated risk of re-identification, raising concerns about user privacy and trust.
arXiv Detail & Related papers (2023-12-05T12:11:11Z)
- Rethinking People Analytics With Inverse Transparency by Design [57.67333075002697]
We propose a new design approach for workforce analytics we refer to as inverse transparency by design.
We find that the required architectural changes can be made without inhibiting core functionality.
We conclude that inverse transparency by design is a promising approach to realize accepted and responsible people analytics.
arXiv Detail & Related papers (2023-05-16T21:37:35Z)
- On the conformance of Android applications with children's data protection regulations and safeguarding guidelines [3.8029070240258687]
Even apps designed for children do not always comply with legislation or guidance.
This lack of compliance could contribute to creating a path towards physical or mental harm.
arXiv Detail & Related papers (2023-05-15T09:46:56Z)
- Protecting User Privacy in Online Settings via Supervised Learning [69.38374877559423]
We design an intelligent approach to online privacy protection that leverages supervised learning.
By detecting and blocking data collection that might infringe on a user's privacy, we can restore a degree of digital privacy to the user.
arXiv Detail & Related papers (2023-04-06T05:20:16Z)
- On the Privacy of Mental Health Apps: An Empirical Investigation and its Implications for Apps Development [14.113922276394588]
This paper reports an empirical study aimed at systematically identifying and understanding data privacy incorporated in mental health apps.
We analyzed 27 top-ranked mental health apps from Google Play Store.
The findings reveal important data privacy issues such as unnecessary permissions, insecure cryptography implementations, and leaks of personal data and credentials in logs and web requests.
arXiv Detail & Related papers (2022-01-22T09:23:56Z)
- Analysis of Longitudinal Changes in Privacy Behavior of Android Applications [79.71330613821037]
In this paper, we examine the trends in how Android apps have changed over time with respect to privacy.
We examine the adoption of HTTPS, whether apps scan the device for other installed apps, the use of permissions for privacy-sensitive data, and the use of unique identifiers.
We find that privacy-related behavior has improved over time as apps continue to receive updates, and that the third-party libraries used by apps are responsible for a growing share of the privacy issues.
arXiv Detail & Related papers (2021-12-28T16:21:31Z)
- SkillBot: Identifying Risky Content for Children in Alexa Skills [4.465104643266321]
Children benefit from the rich functionalities of VPAs but are also exposed to new risks in the VPA ecosystem.
We build a Natural Language Processing-based system to automatically interact with VPA apps.
We identify 28 child-directed apps with risky contents and maintain a growing dataset of 31,966 non-overlapping app behaviors.
arXiv Detail & Related papers (2021-02-05T19:07:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.