Automating privacy decisions -- where to draw the line?
- URL: http://arxiv.org/abs/2305.08747v2
- Date: Tue, 16 May 2023 08:58:12 GMT
- Title: Automating privacy decisions -- where to draw the line?
- Authors: Victor Morel and Simone Fischer-Hübner
- Abstract summary: Users are often overwhelmed by privacy decisions to manage their personal data.
In this paper, we provide an overview of the main challenges raised by the automation of privacy decisions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Users are often overwhelmed by privacy decisions to manage their personal
data, which can happen on the web, in mobile, and in IoT environments. These
decisions can take various forms -- such as setting privacy permissions or
privacy preferences, responding to consent requests, or intervening to
"reject" the processing of one's personal data -- and each can have different
legal impacts. In all cases and for all types of decisions, scholars and
industry have proposed tools to better automate the process of privacy
decisions at different levels, in order to enhance usability. In this paper,
we provide an overview of the main challenges raised by the automation of
privacy decisions, together with a classification scheme for existing and
envisioned work and proposals addressing the automation of privacy decisions.
Related papers
- Can LLMs Make (Personalized) Access Control Decisions? [2.854451361373021]
We propose to leverage the processing and reasoning capabilities of large language models (LLMs) to make dynamic, context-aware access control decisions.
We conducted a user study, which resulted in a dataset of 307 natural-language privacy statements and 14,682 access control decisions made by users.
Our results show that, in general, LLMs can reflect users' preferences well, achieving up to 86% accuracy compared to the decisions made by the majority of users.
arXiv Detail & Related papers (2025-11-25T13:11:23Z) - On the Differential Privacy and Interactivity of Privacy Sandbox Reports [78.85958224681858]
The Privacy Sandbox initiative from Google includes APIs for enabling privacy-preserving advertising functionalities.
We provide an abstract model for analyzing the privacy of these APIs and show that they satisfy a formal DP guarantee.
arXiv Detail & Related papers (2024-12-22T08:22:57Z) - Differential Privacy Overview and Fundamental Techniques [63.0409690498569]
This chapter is meant to be part of the book "Differential Privacy in Artificial Intelligence: From Theory to Practice".
It starts by illustrating various attempts to protect data privacy, emphasizing where and why they failed.
It then defines the key actors, tasks, and scopes that make up the domain of privacy-preserving data analysis.
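As a minimal illustration of the kind of mechanism such a chapter surveys, the classic Laplace mechanism releases a numeric query result with noise calibrated to the query's sensitivity and the privacy budget epsilon. This is a generic sketch of the standard technique, not code from the paper; the function names are assumptions.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value with epsilon-differential privacy by adding
    Laplace noise with scale sensitivity / epsilon."""
    return true_value + laplace_noise(sensitivity / epsilon)

# Example: privately release a count query (sensitivity 1) with epsilon = 0.5.
noisy_count = laplace_mechanism(42.0, sensitivity=1.0, epsilon=0.5)
```

Smaller epsilon means larger noise scale and stronger privacy; the released value is unbiased, so averaging many independent releases concentrates around the true count.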
arXiv Detail & Related papers (2024-11-07T13:52:11Z) - Masked Differential Privacy [64.32494202656801]
We propose an effective approach called masked differential privacy (DP), which allows for controlling sensitive regions where differential privacy is applied.
Our method operates selectively on data and allows for defining non-sensitive spatio-temporal regions without DP application, or for combining differential privacy with other privacy techniques within data samples.
arXiv Detail & Related papers (2024-10-22T15:22:53Z) - Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning [62.224804688233]
Differential privacy (DP) offers a promising solution by ensuring models are 'almost indistinguishable' with or without any particular privacy unit.
We study user-level DP, motivated by applications where it is necessary to ensure uniform privacy protection across users.
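A core step in user-level DP is bounding each user's contribution to a query before adding noise, so the result's sensitivity depends on the per-user cap rather than on how many records a single user holds. The sketch below shows contribution bounding for a count query; it is an illustrative example of the general technique, not code from the paper.

```python
from collections import defaultdict

def bounded_user_count(user_ids: list[str], cap: int = 1) -> int:
    """Count records with each user's total clipped at `cap`, so the
    user-level sensitivity of the result is `cap` regardless of how many
    records any one user contributes."""
    per_user: dict[str, int] = defaultdict(int)
    for uid in user_ids:
        per_user[uid] += 1
    return sum(min(count, cap) for count in per_user.values())

# One heavy user ("a") contributes three records but is counted at most once.
total = bounded_user_count(["a", "a", "a", "b"], cap=1)  # → 2
```

Laplace or Gaussian noise calibrated to `cap` can then be added to `total` to obtain a user-level DP release.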
arXiv Detail & Related papers (2024-06-20T13:54:32Z) - Assessing Mobile Application Privacy: A Quantitative Framework for Privacy Measurement [0.0]
This work aims to contribute to a digital environment that prioritizes privacy, promotes informed decision-making, and endorses the privacy-preserving design principles.
The purpose of this framework is to systematically evaluate the level of privacy risk when using particular Android applications.
arXiv Detail & Related papers (2023-10-31T18:12:19Z) - Tapping into Privacy: A Study of User Preferences and Concerns on Trigger-Action Platforms [0.0]
Internet of Things (IoT) devices are rapidly increasing in popularity, with more individuals using Internet-connected devices that continuously monitor their activities.
This work explores privacy concerns and expectations of end-users related to Trigger-Action Platforms (TAPs) in the context of the Internet of Things (IoT).
TAPs allow users to customize their smart environments by creating rules that trigger actions based on specific events or conditions.
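The trigger-condition-action structure of such rules can be sketched as a small dispatcher: when an incoming event matches a rule's trigger and its condition holds, the action fires. The names and shape below are illustrative assumptions, not an API from the paper or any specific TAP.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    """A minimal trigger-action rule (hypothetical structure for illustration)."""
    trigger: str                        # event name, e.g. "motion_detected"
    condition: Callable[[dict], bool]   # predicate over the event payload
    action: Callable[[dict], None]      # effect to run when the rule fires

def dispatch(rules: list[Rule], event: str, payload: dict) -> None:
    """Fire every rule whose trigger matches the event and whose condition holds."""
    for rule in rules:
        if rule.trigger == event and rule.condition(payload):
            rule.action(payload)

# Example: turn on the porch light when motion is detected after 8 pm.
log: list[str] = []
rules = [Rule(trigger="motion_detected",
              condition=lambda p: p["hour"] >= 20,
              action=lambda p: log.append("porch_light_on"))]
dispatch(rules, "motion_detected", {"hour": 21})  # appends "porch_light_on"
```

Privacy concerns arise because evaluating such rules requires the platform to observe the underlying event streams, which continuously reveal user activity.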
arXiv Detail & Related papers (2023-08-11T14:25:01Z) - Advancing Differential Privacy: Where We Are Now and Future Directions for Real-World Deployment [100.1798289103163]
We present a detailed review of current practices and state-of-the-art methodologies in the field of differential privacy (DP).
Key points and high-level contents of the article originated from the discussions at "Differential Privacy (DP): Challenges Towards the Next Frontier".
This article aims to provide a reference point for the algorithmic and design decisions within the realm of privacy, highlighting important challenges and potential research directions.
arXiv Detail & Related papers (2023-04-14T05:29:18Z) - Leveraging Privacy Profiles to Empower Users in the Digital Society [7.350403786094707]
Privacy and ethics of citizens are at the core of the concerns raised by our increasingly digital society.
We focus on the privacy dimension and contribute a step in the above direction through an empirical study on an existing dataset collected from the fitness domain.
The results reveal that a compact set of semantic-driven questions helps distinguish users better than a complex domain-dependent one.
arXiv Detail & Related papers (2022-04-01T15:31:50Z) - Differential Privacy and Fairness in Decisions and Learning Tasks: A
Survey [50.90773979394264]
It reviews the conditions under which privacy and fairness may have aligned or contrasting goals.
It analyzes how and why DP may exacerbate bias and unfairness in decision problems and learning tasks.
arXiv Detail & Related papers (2022-02-16T16:50:23Z) - Decision Making with Differential Privacy under a Fairness Lens [65.16089054531395]
The U.S. Census Bureau releases data sets and statistics about groups of individuals that are used as input to a number of critical decision processes.
To conform to privacy and confidentiality requirements, these agencies are often required to release privacy-preserving versions of the data.
This paper studies the release of differentially private data sets and analyzes their impact on some critical resource allocation tasks under a fairness perspective.
arXiv Detail & Related papers (2021-05-16T21:04:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.