Is Downloading this App Consistent with my Values? Conceptualizing a
Value-Centered Privacy Assistant
- URL: http://arxiv.org/abs/2106.12458v2
- Date: Wed, 25 Aug 2021 11:26:26 GMT
- Title: Is Downloading this App Consistent with my Values? Conceptualizing a
Value-Centered Privacy Assistant
- Authors: Sarah E. Carter
- Abstract summary: I propose that data privacy decisions can be understood as an expression of user values.
I further propose the creation of a value-centered privacy assistant (VcPA).
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Digital privacy notices aim to provide users with information to make
informed decisions. They are, however, fraught with difficulties. Instead, I
propose that data privacy decisions can be understood as an expression of user
values. To optimize this value expression, I further propose the creation of a
value-centered privacy assistant (VcPA). Here, I preliminarily explore how a VcPA
could enhance user value expression using three user scenarios in the
context of deciding whether or not to download an environmental application,
the OpenLitterMap app. These scenarios are conceptually constructed from
established privacy user groups: the privacy fundamentalists, the privacy
pragmatists, and the privacy unconcerned. I conclude that the VcPA best
facilitates user value expression of the privacy fundamentalists. In contrast,
the value expression of the privacy pragmatists and the privacy unconcerned
could be enhanced or hindered depending on the context and their internal
states. Possible implications for optimal VcPA design are also discussed.
Following this initial conceptual exploration of VcPAs, further empirical
research will be required to demonstrate the effectiveness of the VcPA system
in real-world settings.
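As a rough illustration of the VcPA concept, here is a minimal sketch assuming a hypothetical value-to-permission conflict table. The value categories, permission names, and matching rule are invented for illustration, not the paper's design; the idea is simply to flag app permissions that appear inconsistent with a user's stated values before download.

```python
# Minimal, hypothetical sketch of a value-centered privacy assistant (VcPA).
# The value categories, permissions, and matching rule are illustrative
# assumptions, not the paper's actual design.

# Permissions that could conflict with a given user value.
VALUE_CONFLICTS = {
    "anonymity": {"fine_location", "contacts"},
    "autonomy": {"background_tracking"},
    "environmental_action": set(),  # a value an app can serve rather than threaten
}

def review_app(user_values, requested_permissions):
    """Return the permissions that appear inconsistent with the user's values."""
    conflicts = {}
    for value in user_values:
        overlap = VALUE_CONFLICTS.get(value, set()) & set(requested_permissions)
        if overlap:
            conflicts[value] = sorted(overlap)
    return conflicts

# Example: a user weighing whether to download the OpenLitterMap app.
flags = review_app(
    user_values=["anonymity", "environmental_action"],
    requested_permissions=["fine_location", "camera"],
)
print(flags)  # {'anonymity': ['fine_location']}
```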
Related papers
- On the Differential Privacy and Interactivity of Privacy Sandbox Reports [78.21466601986265]
The Privacy Sandbox initiative from Google includes APIs for enabling privacy-preserving advertising functionalities.
We provide a formal model for analyzing the privacy of these APIs and show that they satisfy a formal DP guarantee.
arXiv Detail & Related papers (2024-12-22T08:22:57Z)
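For reference, the formal DP guarantee mentioned above is typically the standard (epsilon, delta)-differential privacy notion (standard definition, not specific to this paper):

```latex
% Standard (\varepsilon,\delta)-differential privacy: a randomized mechanism M
% satisfies the guarantee if, for all neighboring datasets D, D' (differing in
% a single record) and all measurable output sets S:
\Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S] + \delta
```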
- Activity Recognition on Avatar-Anonymized Datasets with Masked Differential Privacy [64.32494202656801]
Privacy-preserving computer vision is an important emerging problem in machine learning and artificial intelligence.
We present an anonymization pipeline that replaces sensitive human subjects in video datasets with synthetic avatars in context.
We also propose MaskDP to protect non-anonymized but privacy-sensitive background information.
arXiv Detail & Related papers (2024-10-22T15:22:53Z)
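A highly simplified sketch of what such an anonymization pipeline could look like, assuming hypothetical detector and renderer stand-ins. The noise step below is a crude illustration of noising only masked background regions, not the paper's actual MaskDP mechanism:

```python
# Illustrative sketch of an avatar-anonymization pipeline with a masked noise
# step. All functions are hypothetical stand-ins, not the paper's code.
import numpy as np

def detect_people(frame):
    """Hypothetical person detector returning (x, y, w, h) boxes.
    A real pipeline would use a trained model; here we return no detections
    so the sketch stays self-contained."""
    return []

def render_avatar(frame, box):
    """Stand-in avatar renderer: fills the region with a flat placeholder."""
    x, y, w, h = box
    frame[y:y + h, x:x + w] = 127.0

def anonymize_frame(frame, sensitive_mask, noise_scale=25.0, rng=None):
    """Replace detected people with avatars, then noise masked background."""
    rng = rng or np.random.default_rng(0)
    out = frame.astype(np.float32)           # work on a float copy
    for box in detect_people(out):           # 1) avatar substitution
        render_avatar(out, box)
    noise = rng.laplace(scale=noise_scale, size=out.shape)
    out[sensitive_mask] += noise[sensitive_mask]  # 2) crude MaskDP stand-in
    return np.clip(out, 0, 255).astype(np.uint8)

# Toy usage on a blank 4x4 frame with one masked background pixel.
frame = np.zeros((4, 4, 3))
mask = np.zeros((4, 4, 3), dtype=bool)
mask[0, 0] = True
print(anonymize_frame(frame, mask)[0, 0])
```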
- Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning [62.224804688233]
Differential privacy (DP) offers a promising solution by ensuring models are 'almost indistinguishable' with or without any particular privacy unit.
We study user-level DP, motivated by applications where it is necessary to ensure uniform privacy protection across users.
arXiv Detail & Related papers (2024-06-20T13:54:32Z)
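User-level DP strengthens the usual record-level neighboring relation: two datasets are neighbors when a single user's entire contribution changes. A standard statement (not necessarily the paper's exact formulation):

```latex
% User-level (\varepsilon,\delta)-DP: D and D' are neighbors when they differ
% in the entire contribution of one user (all of that user's records), rather
% than in a single record. M satisfies user-level DP if, for all such
% neighbors and all measurable output sets S:
\Pr[M(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D') \in S] + \delta
```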
- A New Hope: Contextual Privacy Policies for Mobile Applications and An Approach Toward Automated Generation [19.578130824867596]
The aim of contextual privacy policies (CPPs) is to fragment privacy policies into concise snippets, displaying them only within the corresponding contexts in the application's graphical user interfaces (GUIs).
In this paper, we first formulate CPPs in the mobile application scenario, and then present a novel multimodal framework, named SeePrivacy, specifically designed to automatically generate CPPs for mobile applications.
A human evaluation shows that 77% of the extracted privacy policy segments were perceived as well-aligned with the detected contexts.
arXiv Detail & Related papers (2024-02-22T13:32:33Z)
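A contextual privacy policy can be pictured as a mapping from GUI contexts to short policy snippets. A toy sketch of that idea, with contexts and snippet texts invented for illustration (SeePrivacy itself generates such mappings automatically with a multimodal model):

```python
# Toy model of a contextual privacy policy (CPP): short policy snippets keyed
# by the GUI context in which they should be shown. Contexts and snippet
# texts are invented for illustration.
CPP_SNIPPETS = {
    "camera_screen": "Photos you take are uploaded together with GPS coordinates.",
    "signup_form": "Your email address is stored and used for account recovery.",
}

def snippet_for_context(context):
    """Return the policy snippet to display in the current GUI context, if any."""
    return CPP_SNIPPETS.get(context)

print(snippet_for_context("camera_screen"))
```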
- The Privacy-Value-App Relationship and the Value-Centered Privacy Assistant [3.885316081594592]
We aim to better understand the relationship between our values, our privacy preferences, and our app choices.
We explore the effectiveness of a smartphone value-centered privacy assistant (VcPA) at promoting value-centered app selection.
arXiv Detail & Related papers (2023-08-10T17:04:12Z)
- A Randomized Approach for Tight Privacy Accounting [63.67296945525791]
We propose a new differential privacy paradigm called estimate-verify-release (EVR).
The EVR paradigm first estimates the privacy parameter of a mechanism, then verifies whether the mechanism meets this guarantee, and finally releases the query output.
Our empirical evaluation shows that the newly proposed EVR paradigm improves the utility-privacy tradeoff for privacy-preserving machine learning.
arXiv Detail & Related papers (2023-04-17T00:38:01Z)
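The estimate-verify-release control flow described above can be sketched as follows; the estimator and verifier here are hypothetical stand-ins, and the paper's actual verifier is more involved:

```python
# Minimal sketch of the estimate-verify-release (EVR) control flow.
# `estimate_epsilon` and `verify_privacy` are hypothetical stand-ins for the
# paper's estimator and verifier components.

def run_evr(mechanism, data, estimate_epsilon, verify_privacy):
    eps_hat = estimate_epsilon(mechanism)        # 1) estimate the privacy parameter
    if not verify_privacy(mechanism, eps_hat):   # 2) verify the guarantee holds
        return None, eps_hat                     #    fail closed: release nothing
    return mechanism(data), eps_hat              # 3) release the query output

# Toy usage with placeholder components.
result, eps = run_evr(
    mechanism=lambda d: sum(d) + 0.5,     # placeholder noisy query
    data=[1, 2, 3],
    estimate_epsilon=lambda m: 1.0,       # placeholder estimator
    verify_privacy=lambda m, e: True,     # placeholder verifier
)
print(result, eps)  # 6.5 1.0
```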
- A Value-Centered Exploration of Data Privacy and Personalized Privacy Assistants [0.0]
I suggest that, instead of utilizing informed consent, we could create space for more value-centered user decisions.
I utilize Suzy Killmister's four-dimensional theory of autonomy to operationalize value-centered privacy decisions.
arXiv Detail & Related papers (2022-12-01T14:26:33Z)
- Leveraging Privacy Profiles to Empower Users in the Digital Society [7.350403786094707]
Privacy and ethics of citizens are at the core of the concerns raised by our increasingly digital society.
We focus on the privacy dimension and contribute a step in the above direction through an empirical study on an existing dataset collected from the fitness domain.
The results reveal that a compact set of semantic-driven questions helps distinguish users better than a complex domain-dependent one.
arXiv Detail & Related papers (2022-04-01T15:31:50Z)
- Fighting the Fog: Evaluating the Clarity of Privacy Disclosures in the Age of CCPA [29.56312492076473]
Vagueness and ambiguity in privacy policies threaten the ability of consumers to make informed choices about how businesses collect, use, and share personal information.
The California Consumer Privacy Act (CCPA) of 2018 was intended to provide Californian consumers with more control by mandating that businesses clearly disclose their data practices.
Our results suggest that CCPA's mandates for privacy disclosures, as currently implemented, have not yet yielded the level of clarity they were designed to deliver.
arXiv Detail & Related papers (2021-09-28T15:40:57Z)
- Private Reinforcement Learning with PAC and Regret Guarantees [69.4202374491817]
We design privacy-preserving exploration policies for episodic reinforcement learning (RL).
We first provide a meaningful privacy formulation using the notion of joint differential privacy (JDP).
We then develop a private optimism-based learning algorithm that simultaneously achieves strong PAC and regret bounds, and enjoys a JDP guarantee.
arXiv Detail & Related papers (2020-09-18T20:18:35Z)
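Informally, joint differential privacy requires that the joint output shown to all users other than user i be differentially private in user i's data. A standard formulation of pure JDP (general definition, not the paper's exact statement):

```latex
% Pure \varepsilon-joint differential privacy (JDP): for every user i, every
% pair of datasets D, D' differing only in user i's data, and every set S of
% joint outputs to the remaining users (written M(D)_{-i}):
\Pr[M(D)_{-i} \in S] \;\le\; e^{\varepsilon}\,\Pr[M(D')_{-i} \in S]
```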
- PGLP: Customizable and Rigorous Location Privacy through Policy Graph [68.3736286350014]
We propose a new location privacy notion called PGLP, which provides a rich interface to release private locations with customizable and rigorous privacy guarantees.
Specifically, we formalize a user's location privacy requirements using a location policy graph, which is expressive and customizable.
We also design a private location trace release framework that pipelines the detection of location exposure, policy graph repair, and private trajectory release with customizable and rigorous location privacy.
arXiv Detail & Related papers (2020-05-04T04:25:59Z)
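A location policy graph can be pictured as a graph over locations whose edges mark pairs that the released output must keep indistinguishable. A toy sketch of that structure, with locations and edges invented for illustration (PGLP's actual semantics and mechanisms are richer):

```python
# Toy location policy graph: nodes are locations, and an edge between two
# locations means the released output must not let an observer distinguish
# which of the two the user was at. Locations and edges are invented.
POLICY_GRAPH = {
    "home":     {"cafe", "hospital"},
    "cafe":     {"home"},
    "hospital": {"home"},  # visits here must look indistinguishable from home
    "park":     set(),     # isolated node: the policy imposes no requirement
}

def must_protect(loc_a: str, loc_b: str) -> bool:
    """True if the policy requires loc_a and loc_b to be indistinguishable."""
    return loc_b in POLICY_GRAPH.get(loc_a, set())

print(must_protect("home", "cafe"))    # True
print(must_protect("home", "park"))    # False
```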
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.