Personal Data Gentrification
- URL: http://arxiv.org/abs/2103.17109v1
- Date: Wed, 31 Mar 2021 14:26:05 GMT
- Title: Personal Data Gentrification
- Authors: Juan Luis Herrera, Javier Berrocal, Jose Garcia-Alonso, Juan Manuel
Murillo, Hsiao-Yuan Chen, Christine Julien, Niko Mäkitalo, Tommi Mikkonen
- Abstract summary: We live in an era in which the most valued services are not paid for in money, but in personal data.
We propose Personal Data Enfranchisement as a middle ground, empowering individuals to control the sharing of their personal information.
- Score: 5.127089848246933
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We live in an era in which the most valued services are not paid for in
money, but in personal data. Every day, service providers collect the personal
information of billions of individuals and sustain their infrastructure by
marketing profiles labeled with this information to personal data consumers,
such as advertisers. Not all uses of this personal data are for
marketing; data consumers can also include, for instance, public health
authorities tracking pandemics. In either case, individuals have undergone a
process of Personal Data Gentrification, as data ownership has shifted from
individuals to service providers and data consumers, as if the data were worth
nothing to the individuals; these new owners then harness the data to obtain
large profits. Current privacy-enhancing technologies are beginning to allow
individuals to control their information and share less of it. However, not
sharing personal information at all could lead to Personal Data Blight, in
which the potential of personal data in applications that benefit all of
society remains forever latent. In this paper, we propose Personal Data
Enfranchisement as a middle ground, empowering individuals to control the
sharing of their personal information and thereby shift the business flows of
personal information. Based on these insights, we propose a model to move
gradually and incrementally from our current situation towards one of Personal
Data Enfranchisement. Finally, we present a roadmap and some challenges towards
achieving this bold vision.
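To make the idea of individual-controlled sharing concrete, here is a minimal, hypothetical sketch (not taken from the paper) of an enfranchisement-style check in Python: each individual keeps a sharing policy next to their data, and a data consumer's request is answered only for the purposes and attributes that policy allows. All names (`SharingPolicy`, `request_data`, the example purposes) are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class SharingPolicy:
    """Hypothetical per-individual policy: which purposes may use which attributes."""
    allowed: dict = field(default_factory=dict)  # purpose -> set of attribute names

    def permits(self, purpose: str, attribute: str) -> bool:
        return attribute in self.allowed.get(purpose, set())

@dataclass
class Individual:
    data: dict
    policy: SharingPolicy

def request_data(person: Individual, purpose: str, attributes: list) -> dict:
    """Return only the attributes the individual's policy allows for this purpose."""
    return {a: person.data[a] for a in attributes
            if a in person.data and person.policy.permits(purpose, a)}

# Example: Alice allows age and location for public-health use, nothing for advertising.
alice = Individual(
    data={"age": "34", "location": "Austin", "email": "alice@example.com"},
    policy=SharingPolicy(allowed={"public_health": {"age", "location"}}),
)
print(request_data(alice, "public_health", ["age", "location", "email"]))  # age, location only
print(request_data(alice, "advertising", ["age", "location", "email"]))    # empty dict
```

The sketch only illustrates where control sits; the paper's actual model, business-flow shift, and roadmap are described in the full text.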
Related papers
- Towards Personal Data Sharing Autonomy: A Task-driven Data Capsule Sharing System [5.076862984714449]
We introduce a novel task-driven personal data sharing system based on the data capsule paradigm, realizing personal data sharing autonomy.
Specifically, we present a tamper-resistant data capsule encapsulation method, where the data capsule is the minimal unit for independent and secure personal data storage and sharing (a minimal capsule sketch follows after this list).
arXiv Detail & Related papers (2024-09-27T05:13:33Z)
- Enabling Humanitarian Applications with Targeted Differential Privacy [0.39462888523270856]
This paper develops an approach to implementing algorithmic decisions based on personal data.
It provides formal privacy guarantees to data subjects.
We show that stronger privacy guarantees typically come at some cost (a Laplace-mechanism sketch of this tradeoff follows after this list).
arXiv Detail & Related papers (2024-08-24T01:34:37Z)
- Visualising Personal Data Flows: Insights from a Case Study of Booking.com [8.485751288361616]
This paper reports our work using Booking.com as a case study to visualise personal data flows extracted from its privacy policy.
By showcasing how the company shares its consumers' personal data, we raise questions and extend discussions on the challenges and limitations of using privacy policies to inform online users about the true scale and the landscape of personal data flows.
arXiv Detail & Related papers (2023-04-19T12:17:46Z)
- Protecting User Privacy in Online Settings via Supervised Learning [69.38374877559423]
We design an intelligent approach to online privacy protection that leverages supervised learning.
By detecting and blocking data collection that might infringe on a user's privacy, we can restore a degree of digital privacy to the user (a toy classifier sketch follows after this list).
arXiv Detail & Related papers (2023-04-06T05:20:16Z)
- Position: Considerations for Differentially Private Learning with Large-Scale Public Pretraining [75.25943383604266]
We question whether the use of large Web-scraped datasets should be viewed as differential-privacy-preserving.
We caution that publicizing these models pretrained on Web data as "private" could lead to harm and erode the public's trust in differential privacy as a meaningful definition of privacy.
We conclude by discussing potential paths forward for the field of private learning, as public pretraining becomes more popular and powerful.
arXiv Detail & Related papers (2022-12-13T10:41:12Z)
- Privacy Explanations - A Means to End-User Trust [64.7066037969487]
We looked into how explainability might help to tackle the problem of end-user trust.
We created privacy explanations that aim to clarify to end users why and for what purposes specific data is required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z)
- Certified Data Removal in Sum-Product Networks [78.27542864367821]
Deleting the collected data is often insufficient to guarantee data privacy.
UnlearnSPN is an algorithm that removes the influence of single data points from a trained sum-product network.
arXiv Detail & Related papers (2022-10-04T08:22:37Z)
- Ride Sharing & Data Privacy: An Analysis of the State of Practice [0.0]
We analyzed how popular ride sharing services handle user privacy to assess the current state of practice.
The results show that services include a varying set of personal data and offer limited privacy-related features.
arXiv Detail & Related papers (2021-10-18T11:06:06Z)
- A vision for global privacy bridges: Technical and legal measures for international data markets [77.34726150561087]
Despite data protection laws and an acknowledged right to privacy, trading personal information has become a business equated with "trading oil".
An open conflict is arising between business demands for data and a desire for privacy.
We propose and test a vision of a personal information market with privacy.
arXiv Detail & Related papers (2020-05-13T13:55:50Z)
- Utility-aware Privacy-preserving Data Releasing [7.462336024223669]
We propose a two-step perturbation-based privacy-preserving data releasing framework.
First, certain predefined privacy and utility problems are learned from the public domain data.
We then leverage the learned knowledge to precisely perturb the data owners' data into privatized data (a generic two-step sketch follows after this list).
arXiv Detail & Related papers (2020-05-09T05:32:46Z)
- Beyond privacy regulations: an ethical approach to data usage in transportation [64.86110095869176]
We describe how Federated Machine Learning can be applied to the transportation sector.
We see Federated Learning as a method that enables us to process privacy-sensitive data while respecting customers' privacy (a minimal federated-averaging sketch follows after this list).
arXiv Detail & Related papers (2020-04-01T15:10:12Z)
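The data capsule entry above describes a tamper-resistant unit that bundles personal data with its sharing terms. As a rough illustration only (the paper's actual encapsulation method is more involved), the sketch below seals a capsule with an HMAC so that any tampering with the data or its policy is detected on opening; the key handling and field names are assumptions, not that system's API.

```python
import hashlib
import hmac
import json

def seal_capsule(data: dict, policy: dict, key: bytes) -> dict:
    """Bundle data with its sharing policy and a tag that detects tampering."""
    payload = json.dumps({"data": data, "policy": policy}, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.decode(), "tag": tag}

def open_capsule(capsule: dict, key: bytes) -> dict:
    """Verify the tag before releasing the contents; refuse altered capsules."""
    payload = capsule["payload"].encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, capsule["tag"]):
        raise ValueError("capsule integrity check failed")
    return json.loads(payload)

key = b"owner-held-secret"  # assumption: the verification key stays with the data owner
cap = seal_capsule({"daily_steps": 8042}, {"share_with": ["my_doctor"]}, key)
print(open_capsule(cap, key)["policy"])
```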
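The targeted differential privacy entry notes that stronger privacy guarantees typically come at some cost. A standard way to see this tradeoff, not specific to that paper, is the Laplace mechanism: a numeric answer is released with noise scaled to sensitivity divided by epsilon, so a smaller epsilon (stronger privacy) yields a noisier, less useful answer.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float,
                      rng: np.random.Generator) -> float:
    """Release true_value with Laplace noise calibrated for epsilon-differential privacy."""
    scale = sensitivity / epsilon
    return true_value + rng.laplace(loc=0.0, scale=scale)

rng = np.random.default_rng(0)
count = 1234.0      # e.g. how many people in a region meet an aid-eligibility criterion
sensitivity = 1.0   # adding or removing one person changes the count by at most 1
for eps in (1.0, 0.1, 0.01):  # smaller epsilon = stronger privacy = more noise
    print(f"epsilon={eps:>5}: released count {laplace_mechanism(count, sensitivity, eps, rng):.1f}")
```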
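The supervised learning entry proposes detecting and blocking data collection that might infringe on a user's privacy. As a loose, toy illustration of that general idea (not the paper's actual features, model, or data), a text classifier over request URLs could flag likely tracking requests:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled requests: 1 = likely data collection (block), 0 = ordinary content (allow).
# A real system would be trained on a curated corpus, not six handwritten examples.
urls = [
    "https://ads.tracker.example/pixel?uid=123",
    "https://analytics.example/collect?cid=9",
    "https://beacon.example/track?session=abc",
    "https://cdn.example/site.css",
    "https://news.example/article/42",
    "https://images.example/logo.png",
]
labels = [1, 1, 1, 0, 0, 0]

# Character n-grams capture tell-tale substrings such as "track", "pixel", "uid=".
model = make_pipeline(CountVectorizer(analyzer="char_wb", ngram_range=(3, 5)),
                      LogisticRegression())
model.fit(urls, labels)
print(model.predict(["https://metrics.example/track?uid=7",
                     "https://cdn.example/app.js"]))
```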
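The utility-aware data releasing entry describes a two-step framework: learn privacy and utility characteristics from public data, then perturb the owners' data before release. The sketch below is only a generic stand-in for that idea, not the paper's method: attribute importance is estimated from public data, and less task-relevant attributes receive proportionally more Gaussian noise.

```python
import numpy as np

def learn_attribute_importance(public_X: np.ndarray, public_y: np.ndarray) -> np.ndarray:
    """Step 1 (public data): score each attribute by |correlation| with the utility target."""
    scores = np.array([abs(np.corrcoef(public_X[:, j], public_y)[0, 1])
                       for j in range(public_X.shape[1])])
    return scores / scores.max()

def perturb_for_release(private_X: np.ndarray, importance: np.ndarray,
                        max_noise: float, rng: np.random.Generator) -> np.ndarray:
    """Step 2 (owners' data): add more noise to attributes that matter less for utility."""
    noise_scale = max_noise * (1.0 - importance)  # important attribute -> little noise
    return private_X + rng.normal(scale=noise_scale, size=private_X.shape)

rng = np.random.default_rng(2)
public_X = rng.normal(size=(200, 3))
public_y = 3 * public_X[:, 0] + rng.normal(scale=0.5, size=200)  # attribute 0 drives utility
importance = learn_attribute_importance(public_X, public_y)

private_X = rng.normal(size=(5, 3))
released = perturb_for_release(private_X, importance, max_noise=2.0, rng=rng)
print("importance per attribute:", np.round(importance, 2))
```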
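The final entry sees Federated Learning as a way to process privacy-sensitive transportation data while respecting customers' privacy. A minimal federated-averaging sketch, generic rather than that paper's setup: each client trains on its own data, and only model weights, never raw records, are aggregated.

```python
import numpy as np

def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray,
                 lr: float = 0.1, steps: int = 20) -> np.ndarray:
    """One client's contribution: a few gradient steps of linear regression on local data only."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(global_w: np.ndarray, clients: list) -> np.ndarray:
    """One FedAvg round: weight the clients' locally trained models by their data size."""
    local_weights = [local_update(global_w, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    return np.average(local_weights, axis=0, weights=sizes)

rng = np.random.default_rng(1)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):  # e.g. three transport operators, each keeping its trip data on-premise
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

w = np.zeros(2)
for _ in range(10):  # ten communication rounds
    w = federated_average(w, clients)
print("weights learned without pooling raw data:", np.round(w, 2))
```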
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.