Assessing Mobile Application Privacy: A Quantitative Framework for Privacy Measurement
- URL: http://arxiv.org/abs/2311.00066v1
- Date: Tue, 31 Oct 2023 18:12:19 GMT
- Title: Assessing Mobile Application Privacy: A Quantitative Framework for Privacy Measurement
- Authors: Joao Marono, Catarina Silva, Joao P. Barraca, Vitor Cunha, Paulo Salvador
- Abstract summary: This work aims to contribute to a digital environment that prioritizes privacy, promotes informed decision-making, and endorses privacy-preserving design principles.
The purpose of this framework is to systematically evaluate the level of privacy risk when using particular Android applications.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The proliferation of mobile applications and the subsequent sharing of personal data with service and application providers have given rise to substantial privacy concerns. Application marketplaces have introduced mechanisms to conform to regulations and provide individuals with control over their data. However, a notable absence persists regarding clear indications, labels or scores elucidating the privacy implications of these applications. In response to this challenge, this paper introduces a privacy quantification framework. The purpose of this framework is to systematically evaluate the level of privacy risk when using particular Android applications. The main goal is to provide individuals with qualitative labels to make informed decisions about their privacy. This work aims to contribute to a digital environment that prioritizes privacy, promotes informed decision-making, and endorses the incorporation of privacy-preserving design principles.
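The core idea of such a framework, scoring an app's privacy risk and translating the score into a consumer-facing label, can be sketched as follows. This is a hypothetical illustration only: the inputs (dangerous permissions, tracker count, third-party sharing), the weights, and the A-E label scale are assumptions made for the sketch, not the paper's actual scheme.

```python
# Hypothetical sketch: aggregate app privacy signals into a numeric
# risk score, then map the score to a qualitative label for end users.

def privacy_risk_score(dangerous_permissions: int, trackers: int,
                       shares_with_third_parties: bool) -> float:
    """Weighted risk in [0, 1]; the weights are illustrative assumptions."""
    score = 0.05 * dangerous_permissions + 0.04 * trackers
    if shares_with_third_parties:
        score += 0.3
    return min(score, 1.0)

def privacy_label(score: float) -> str:
    """Map a numeric score to a qualitative label (A = lowest risk)."""
    for threshold, label in [(0.2, "A"), (0.4, "B"), (0.6, "C"), (0.8, "D")]:
        if score < threshold:
            return label
    return "E"

# Example: an app requesting 6 dangerous permissions, embedding
# 5 trackers, and sharing data with third parties scores 0.8 -> "E".
print(privacy_label(privacy_risk_score(6, 5, True)))
```

The two-stage design (numeric score, then coarse label) matches the paper's stated goal of giving individuals qualitative labels rather than raw numbers.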
Related papers
- Interactive GDPR-Compliant Privacy Policy Generation for Software Applications [6.189770781546807]
To use software applications, users are sometimes requested to provide their personal information.
As privacy has become a significant concern, many protection regulations exist worldwide.
We propose an approach that generates comprehensive and compliant privacy policies.
arXiv Detail & Related papers (2024-10-04T01:22:16Z) - How Privacy-Savvy Are Large Language Models? A Case Study on Compliance and Privacy Technical Review [15.15468770348023]
We evaluate large language models' performance in privacy-related tasks such as privacy information extraction (PIE), legal and regulatory key point detection (KPD), and question answering (QA).
Through an empirical assessment, we investigate the capacity of several prominent LLMs, including BERT, GPT-3.5, GPT-4, and custom models, in executing privacy compliance checks and technical privacy reviews.
While LLMs show promise in automating privacy reviews and identifying regulatory discrepancies, significant gaps persist in their ability to fully comply with evolving legal standards.
arXiv Detail & Related papers (2024-09-04T01:51:37Z) - PrivacyLens: Evaluating Privacy Norm Awareness of Language Models in Action [54.11479432110771]
PrivacyLens is a novel framework designed to extend privacy-sensitive seeds into expressive vignettes and further into agent trajectories.
We instantiate PrivacyLens with a collection of privacy norms grounded in privacy literature and crowdsourced seeds.
State-of-the-art LMs, like GPT-4 and Llama-3-70B, leak sensitive information in 25.68% and 38.69% of cases, even when prompted with privacy-enhancing instructions.
arXiv Detail & Related papers (2024-08-29T17:58:38Z) - Collection, usage and privacy of mobility data in the enterprise and public administrations [55.2480439325792]
Security measures such as anonymization are needed to protect individuals' privacy.
Within our study, we conducted expert interviews to gain insights into practices in the field.
We survey privacy-enhancing methods in use, which generally do not comply with state-of-the-art standards of differential privacy.
arXiv Detail & Related papers (2024-07-04T08:29:27Z) - Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning [62.224804688233]
Differential privacy (DP) offers a promising solution by ensuring models are 'almost indistinguishable' with or without any particular privacy unit.
We study user-level DP, motivated by applications where it is necessary to ensure uniform privacy protection across users.
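The user-level notion can be made concrete with a small sketch of the Laplace mechanism: clipping each user's total contribution to a cap bounds the user-level sensitivity of a sum, and noise is calibrated to that bound. The function names, the clipping scheme, and the restriction to nonnegative values are assumptions of this sketch, not the paper's algorithm.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) by inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def user_level_dp_sum(per_user_values: dict, cap: float,
                      epsilon: float) -> float:
    """Noisy sum of nonnegative values with user-level epsilon-DP.

    Clipping each user's total to `cap` bounds the user-level
    sensitivity of the sum by `cap`, so adding Laplace(cap / epsilon)
    noise gives epsilon-DP at the user level, however many records
    a single user contributes."""
    clipped_total = sum(min(sum(values), cap)
                        for values in per_user_values.values())
    return clipped_total + laplace_noise(cap / epsilon)
```

The key contrast with record-level DP is that the clip is applied per user, not per record, so a user with many records cannot exceed the sensitivity bound.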
arXiv Detail & Related papers (2024-06-20T13:54:32Z) - A Qualitative Analysis Framework for mHealth Privacy Practices [0.0]
This paper introduces a novel framework for the qualitative evaluation of privacy practices in mHealth apps.
Our investigation encompasses an analysis of 152 leading mHealth apps on the Android platform.
Our findings indicate persistent issues with negligence and misuse of sensitive user information.
arXiv Detail & Related papers (2024-05-28T08:57:52Z) - PrivComp-KG : Leveraging Knowledge Graph and Large Language Models for Privacy Policy Compliance Verification [0.0]
We propose a Large Language Model (LLM) and Semantic Web based approach for privacy compliance.
PrivComp-KG is designed to efficiently store and retrieve comprehensive information concerning privacy policies.
It can be queried to check for compliance with privacy policies by each vendor against relevant policy regulations.
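At its simplest, this kind of compliance query can be caricatured as clause coverage: a vendor's policy is flagged against every regulatory requirement it fails to address. The clause names here are invented for illustration, and PrivComp-KG stores and queries such information in a knowledge graph rather than in Python sets.

```python
# Hypothetical sketch: compliance checking as clause coverage.
# Clause names are invented; the real system models policies and
# regulations in a knowledge graph queried with LLM assistance.
REQUIRED_CLAUSES = {"data_collection", "retention",
                    "third_party_sharing", "user_rights"}

def missing_clauses(policy_clauses: set) -> set:
    """Return the required clauses a vendor's policy does not address."""
    return REQUIRED_CLAUSES - set(policy_clauses)

# A policy covering three of the four required clauses:
gaps = missing_clauses({"data_collection", "retention", "user_rights"})
print(sorted(gaps))  # ['third_party_sharing']
```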
arXiv Detail & Related papers (2024-04-30T17:44:44Z) - Advancing Differential Privacy: Where We Are Now and Future Directions for Real-World Deployment [100.1798289103163]
We present a detailed review of current practices and state-of-the-art methodologies in the field of differential privacy (DP).
Key points and high-level contents of the article originated from discussions at "Differential Privacy (DP): Challenges Towards the Next Frontier".
This article aims to provide a reference point for the algorithmic and design decisions within the realm of privacy, highlighting important challenges and potential research directions.
arXiv Detail & Related papers (2023-04-14T05:29:18Z) - Privacy Explanations - A Means to End-User Trust [64.7066037969487]
We looked into how explainability might help tackle this problem.
We created privacy explanations that aim to clarify to end users why and for what purposes specific data is required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z) - The Evolving Path of "the Right to Be Left Alone" - When Privacy Meets Technology [0.0]
This paper proposes a novel vision of the privacy ecosystem, introducing privacy dimensions, the related users' expectations, the privacy violations, and the changing factors.
We believe that promising approaches to tackle the privacy challenges move in two directions: (i) identification of effective privacy metrics; and (ii) adoption of formal tools to design privacy-compliant applications.
arXiv Detail & Related papers (2021-11-24T11:27:55Z) - Decision Making with Differential Privacy under a Fairness Lens [65.16089054531395]
The U.S. Census Bureau releases data sets and statistics about groups of individuals that are used as input to a number of critical decision processes.
To conform to privacy and confidentiality requirements, these agencies are often required to release privacy-preserving versions of the data.
This paper studies the release of differentially private data sets and analyzes their impact on some critical resource allocation tasks under a fairness perspective.
arXiv Detail & Related papers (2021-05-16T21:04:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences.