Two Types of Data Privacy Controls
- URL: http://arxiv.org/abs/2503.18729v1
- Date: Mon, 24 Mar 2025 14:37:57 GMT
- Title: Two Types of Data Privacy Controls
- Authors: Eman Alashwali
- Abstract summary: It is not uncommon to hear users say that they feel they have lost control over their data on the web. This article aims to shed light on the often overlooked difference between two main types of privacy from a control perspective.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Users share a vast amount of data while using web and mobile applications. Most service providers, such as email and social media providers, offer privacy controls, which aim to give users the means to control what, how, when, and with whom they share data. Nevertheless, it is not uncommon to hear users say that they feel they have lost control over their data on the web. This article aims to shed light on the often overlooked difference between two main types of privacy from a control perspective: privacy between a user and other users, and privacy between a user and institutions. We argue why this difference is important and outline what we need to do from here.
Related papers
- PriveShield: Enhancing User Privacy Using Automatic Isolated Profiles in Browsers
PriveShield is a lightweight privacy mechanism that disrupts the information gathering cycle. Our evaluation results show that our extension is effective in preventing retargeted ads in 91% of those scenarios.
arXiv Detail & Related papers (2025-01-03T20:29:33Z)
- Fingerprinting and Tracing Shadows: The Development and Impact of Browser Fingerprinting on Digital Privacy
Browser fingerprinting is a growing technique for identifying and tracking users online without relying on traditional methods such as cookies.
This paper gives an overview of the various fingerprinting techniques and analyzes the entropy and uniqueness of the collected data.
arXiv Detail & Related papers (2024-11-18T20:32:31Z)
- Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning
Differential privacy (DP) offers a promising solution by ensuring models are "almost indistinguishable" with or without any particular privacy unit.
We study user-level DP, motivated by applications where it is necessary to ensure uniform privacy protection across users.
arXiv Detail & Related papers (2024-06-20T13:54:32Z)
- Towards a decentralized data privacy protocol for self-sovereignty in the digital world
We propose a paradigm shift towards an enriched user-centric approach for cross-service privacy preference management.
In this vision paper, we propose the realization of a decentralized data privacy protocol.
arXiv Detail & Related papers (2024-04-19T12:19:04Z) - Protecting User Privacy in Online Settings via Supervised Learning [69.38374877559423]
We design an intelligent approach to online privacy protection that leverages supervised learning.
By detecting and blocking data collection that might infringe on a user's privacy, we can restore a degree of digital privacy to the user.
arXiv Detail & Related papers (2023-04-06T05:20:16Z)
- Position: Considerations for Differentially Private Learning with Large-Scale Public Pretraining
We question whether the use of large Web-scraped datasets should be viewed as differential-privacy-preserving.
We caution that publicizing these models pretrained on Web data as "private" could lead to harm and erode the public's trust in differential privacy as a meaningful definition of privacy.
We conclude by discussing potential paths forward for the field of private learning, as public pretraining becomes more popular and powerful.
arXiv Detail & Related papers (2022-12-13T10:41:12Z)
- Privacy Explanations - A Means to End-User Trust
We looked into how explainability might help to tackle this problem.
We created privacy explanations that aim to clarify to end users why and for what purposes specific data is required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z) - "I need a better description'': An Investigation Into User Expectations
For Differential Privacy [31.352325485393074]
We explore users' privacy expectations related to differential privacy.
We find that users care about the kinds of information leaks against which differential privacy protects.
We find that the ways in which differential privacy is described in the wild haphazardly set users' privacy expectations.
arXiv Detail & Related papers (2021-10-13T02:36:37Z) - Second layer data governance for permissioned blockchains: the privacy
management challenge [58.720142291102135]
In pandemic situations, such as the COVID-19 and Ebola outbreaks, sharing health data is crucial to curbing mass infection and reducing the number of deaths.
In this sense, permissioned blockchain technology emerges as a way to empower users with their rights, providing data ownership, transparency, and security through an immutable, unified, and distributed database governed by smart contracts.
arXiv Detail & Related papers (2020-10-22T13:19:38Z) - The Challenges and Impact of Privacy Policy Comprehension [0.0]
This paper experimentally manipulated the privacy-friendliness of an unavoidable and simple privacy policy.
Half of our participants miscomprehended even this transparent privacy policy.
To mitigate such pitfalls, we present design recommendations to improve the quality of informed consent.
arXiv Detail & Related papers (2020-05-18T14:16:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.