The Privacy-Value-App Relationship and the Value-Centered Privacy Assistant
- URL: http://arxiv.org/abs/2308.05700v1
- Date: Thu, 10 Aug 2023 17:04:12 GMT
- Title: The Privacy-Value-App Relationship and the Value-Centered Privacy Assistant
- Authors: Sarah E. Carter, Mathieu d'Aquin, Dayana Spagnuelo, Ilaria Tiddi,
Kathryn Cormican, Heike Felzmann
- Abstract summary: We aim to better understand the relationship between our values, our privacy preferences, and our app choices.
We explore the effectiveness of a smartphone value-centered privacy assistant (VcPA) at promoting value-centered app selection.
- Score: 3.885316081594592
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Many of us make quick decisions that affect our data privacy on our
smartphones without due consideration of our values. One such decision point is
establishing whether to download a smartphone app or not. In this work, we aim
to better understand the relationship between our values, our privacy
preferences, and our app choices, as well as explore the effectiveness of a
smartphone value-centered privacy assistant (VcPA) at promoting value-centered
app selection. To do this, we conducted a mixed-methods study that involved two
phases. The first was an online survey of 273 smartphone users' values and
privacy preferences when considering whether to download one of two apps (Lose
It! and OpenLitterMap). Our results suggest that values and privacy preferences
are related in an app or context-dependent manner. The second phase was testing
the VcPA with 77 users in a synthetic Mock App Store setting. We established
the usability of the VcPA, which helped some users more than others in selecting
apps consistent with their selected value profile. Future qualitative
and context-specific explorations of user perspectives could contribute to
adequately capturing the specific role of values for privacy decision-making
and improving the VcPA.
Related papers
- PrivacyLens: Evaluating Privacy Norm Awareness of Language Models in Action [54.11479432110771]
PrivacyLens is a novel framework designed to extend privacy-sensitive seeds into expressive vignettes and further into agent trajectories.
We instantiate PrivacyLens with a collection of privacy norms grounded in privacy literature and crowdsourced seeds.
State-of-the-art LMs, like GPT-4 and Llama-3-70B, leak sensitive information in 25.68% and 38.69% of cases, even when prompted with privacy-enhancing instructions.
arXiv Detail & Related papers (2024-08-29T17:58:38Z) - Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning [62.224804688233]
Differential privacy (DP) offers a promising solution by ensuring that models are 'almost indistinguishable' with or without the data of any particular privacy unit.
We study user-level DP, motivated by applications where it is necessary to ensure uniform privacy protection across users.
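As background (not drawn from this paper's abstract), 'almost indistinguishable' refers to the standard (epsilon, delta)-differential-privacy guarantee; the choice of privacy unit, e.g. a single example versus all of one user's data, determines which datasets count as neighboring:
```latex
% (epsilon, delta)-DP: a randomized mechanism M satisfies, for all neighboring
% datasets D and D' (differing in one privacy unit) and all output sets S,
\Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in S] + \delta
```
Under user-level DP, D and D' differ in the entire contribution of one user rather than a single record, which is what yields uniform protection across users.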
arXiv Detail & Related papers (2024-06-20T13:54:32Z) - {A New Hope}: Contextual Privacy Policies for Mobile Applications and An
Approach Toward Automated Generation [19.578130824867596]
The aim of contextual privacy policies (CPPs) is to fragment privacy policies into concise snippets and display them only within the corresponding contexts of the application's graphical user interfaces (GUIs).
In this paper, we first formulate CPPs in the mobile application scenario, and then present a novel multimodal framework, named SeePrivacy, specifically designed to automatically generate CPPs for mobile applications.
A human evaluation shows that 77% of the extracted privacy policy segments were perceived as well-aligned with the detected contexts.
arXiv Detail & Related papers (2024-02-22T13:32:33Z) - A Randomized Approach for Tight Privacy Accounting [63.67296945525791]
We propose a new differential privacy paradigm called estimate-verify-release (EVR).
The EVR paradigm first estimates the privacy parameter of a mechanism, then verifies whether it meets this guarantee, and finally releases the query output.
Our empirical evaluation shows the newly proposed EVR paradigm improves the utility-privacy tradeoff for privacy-preserving machine learning.
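A minimal sketch of the estimate-verify-release control flow described in this summary, under assumed interfaces (the estimator, verifier, and parameter names are illustrative placeholders, not the paper's actual API):
```python
from typing import Any, Callable

def estimate_verify_release(
    mechanism: Callable[[Any], Any],        # the private mechanism to run
    data: Any,                              # the sensitive input
    estimate_epsilon: Callable[[], float],  # step 1: estimate the privacy parameter
    verify: Callable[[float], bool],        # step 2: check the estimate is a valid bound
    epsilon_budget: float,                  # privacy budget we are willing to spend
) -> Any:
    """Illustrative EVR flow: estimate, verify, and only then release."""
    estimated_epsilon = estimate_epsilon()
    if not verify(estimated_epsilon):
        # Verification failed: withhold the output entirely.
        raise RuntimeError("privacy estimate could not be verified")
    if estimated_epsilon > epsilon_budget:
        raise RuntimeError("verified privacy loss exceeds the budget")
    # Only after verification does the query output get released.
    return mechanism(data)
```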
arXiv Detail & Related papers (2023-04-17T00:38:01Z) - Mining User Privacy Concern Topics from App Reviews [10.776958968245589]
An increasing number of users are voicing their privacy concerns through app reviews on App stores.
The main challenge in effectively mining privacy concerns from user reviews is that reviews expressing privacy concerns are drowned out by a large number of reviews expressing more generic themes and noisy content.
In this work, we propose a novel automated approach to overcome that challenge.
arXiv Detail & Related papers (2022-12-19T08:07:27Z) - A Value-Centered Exploration of Data Privacy and Personalized Privacy
Assistants [0.0]
I suggest that, instead of utilizing informed consent, we could create space for more value-centered user decisions.
I utilize Suzy Killmister's four-dimensional theory of autonomy to operationalize value-centered privacy decisions.
arXiv Detail & Related papers (2022-12-01T14:26:33Z) - Leveraging Privacy Profiles to Empower Users in the Digital Society [7.350403786094707]
Privacy and ethics of citizens are at the core of the concerns raised by our increasingly digital society.
We focus on the privacy dimension and contribute a step in this direction through an empirical study of an existing dataset collected from the fitness domain.
The results reveal that a compact set of semantic-driven questions helps distinguish users better than a complex domain-dependent one.
arXiv Detail & Related papers (2022-04-01T15:31:50Z) - Fully Adaptive Composition in Differential Privacy [53.01656650117495]
Well-known advanced composition theorems allow one to query a private database quadratically more times than basic privacy composition would permit.
We introduce fully adaptive composition, wherein both algorithms and their privacy parameters can be selected adaptively.
We construct filters that match the rates of advanced composition, including constants, despite allowing for adaptively chosen privacy parameters.
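For reference (standard results, not specific to this paper), composing k mechanisms that are each (epsilon, delta)-DP gives, by basic composition, a (k*epsilon, k*delta)-DP guarantee, while advanced composition improves the dependence on k to roughly the square root of k, which is where the 'quadratically more queries' comes from:
```latex
% Advanced composition: for any delta' > 0, k adaptively chosen
% (epsilon, delta)-DP mechanisms jointly satisfy
\Bigl(\sqrt{2k\ln(1/\delta')}\,\varepsilon \;+\; k\,\varepsilon\,(e^{\varepsilon}-1),\;\; k\delta + \delta'\Bigr)\text{-DP}
```
Holding the total privacy budget fixed, the square-root scaling allows roughly quadratically more queries than the linear bound of basic composition; the paper's filters match this rate even when each privacy parameter is chosen adaptively.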
arXiv Detail & Related papers (2022-03-10T17:03:12Z) - Analysis of Longitudinal Changes in Privacy Behavior of Android
Applications [79.71330613821037]
In this paper, we examine the trends in how Android apps have changed over time with respect to privacy.
We examine the adoption of HTTPS, whether apps scan the device for other installed apps, the use of permissions for privacy-sensitive data, and the use of unique identifiers.
We find that privacy-related behavior has improved over time as apps continue to receive updates, and that the third-party libraries used by apps are responsible for more of the privacy issues.
arXiv Detail & Related papers (2021-12-28T16:21:31Z) - Is Downloading this App Consistent with my Values? Conceptualizing a
Value-Centered Privacy Assistant [0.0]
I propose that data privacy decisions can be understood as an expression of user values.
I further propose the creation of a value-centered privacy assistant (VcPA).
arXiv Detail & Related papers (2021-06-23T15:08:58Z) - Private Reinforcement Learning with PAC and Regret Guarantees [69.4202374491817]
We design privacy-preserving exploration policies for episodic reinforcement learning (RL).
We first provide a meaningful privacy formulation using the notion of joint differential privacy (JDP).
We then develop a private optimism-based learning algorithm that simultaneously achieves strong PAC and regret bounds, and enjoys a JDP guarantee.
arXiv Detail & Related papers (2020-09-18T20:18:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.