From "I have nothing to hide" to "It looks like stalking": Measuring Americans' Level of Comfort with Individual Mobility Features Extracted from Location Data
- URL: http://arxiv.org/abs/2502.05686v1
- Date: Sat, 08 Feb 2025 20:34:18 GMT
- Authors: Naman Awasthi, Saad Mohammad Abrar, Daniel Smolyak, Vanessa Frias-Martinez
- Abstract summary: Location data collection has become widespread as smartphones have become ubiquitous.
Data aggregators and data brokers offer access to individual location data.
The FTC has also started to vigorously regulate consumer privacy for location data.
- Abstract: Location data collection has become widespread as smartphones have become ubiquitous. Smartphone apps often collect precise location data from users by offering "free" services and then monetize it for advertising and marketing purposes. While major tech companies sell only aggregate behaviors for marketing purposes, data aggregators and data brokers offer access to individual location data. Some data brokers and aggregators have rules in place to preserve privacy, and the FTC has also started to vigorously regulate consumer privacy for location data. In this paper, we present an in-depth exploration of U.S. privacy perceptions with respect to specific location features derivable from data made available by location data brokers and aggregators. These results can provide policy implications that could assist organizations like the FTC in defining clear access rules. Using a factorial vignette survey, we collected responses from 1,405 participants to evaluate their level of comfort with sharing different types of location features, including individual trajectory data and visits to points of interest, available for purchase from data brokers worldwide. Our results show that trajectory-related features are associated with higher privacy concerns, that some data-broker obfuscation practices increase levels of comfort, and that race, ethnicity, and education have an effect on data-sharing privacy perceptions. We also model people's privacy perceptions as a predictive task, achieving an F1 score of 0.6.
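For readers unfamiliar with the metric, the F1 score reported above is the harmonic mean of precision and recall on the positive class. A minimal sketch of how it is computed (the labels below are hypothetical, not data from the study):

```python
# Minimal sketch of the F1 metric used to evaluate a comfort-level classifier.
# The labels below are hypothetical, not data from the paper.

def f1_score(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# 1 = "comfortable sharing", 0 = "uncomfortable" (hypothetical labels)
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]
print(f1_score(y_true, y_pred))  # → 0.75
```

An F1 of 0.6 thus indicates the model recovers participants' comfort levels noticeably better than chance, but far from perfectly.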
Related papers
- "I'm not for sale" -- Perceptions and limited awareness of privacy risks by digital natives about location data
We perform a quantitative and qualitative analysis of smartphone users' awareness, perception and self-reported behavior towards location data-sharing.
About 54% of participants underestimate the number of mobile applications to which they have granted access to their data.
We observe that slightly more than half of participants (57%) are surprised by the extent of potentially inferred information.
arXiv Detail & Related papers (2025-02-17T10:49:23Z)
- Investigating Vulnerabilities of GPS Trip Data to Trajectory-User Linking Attacks
We propose a novel attack to reconstruct user identifiers in GPS trip datasets consisting of single trips.
We show that the risk of re-identification is significant even when personal identifiers have been removed.
Further investigations indicate that users who frequently visit locations that are only visited by a small number of others tend to be more vulnerable to re-identification.
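That last finding reflects a well-known intuition: a user whose combination of visited locations matches no one else's is trivially linkable even after identifiers are stripped. A toy "unicity" check sketches this (illustrative only; the users and locations are hypothetical, and this is not the attack the paper proposes):

```python
# Toy illustration of re-identification risk: count users whose set of
# visited locations is unique in the dataset. Not the paper's attack.
from collections import Counter

trips = {  # hypothetical user -> visited location IDs
    "u1": {"cafe", "gym", "office"},
    "u2": {"cafe", "office"},
    "u3": {"cafe", "office"},        # same signature as u2 -> not unique
    "u4": {"cafe", "rare_museum"},   # rare location -> unique signature
}

signatures = Counter(frozenset(locs) for locs in trips.values())
unique_users = [u for u, locs in trips.items() if signatures[frozenset(locs)] == 1]
print(sorted(unique_users))  # → ['u1', 'u4']
```

Here "u4" is exposed precisely because "rare_museum" is visited by almost no one, matching the paper's observation about rarely visited locations.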
arXiv Detail & Related papers (2025-02-12T08:54:49Z)
- Collection, usage and privacy of mobility data in the enterprise and public administrations
Security measures such as anonymization are needed to protect individuals' privacy.
Within our study, we conducted expert interviews to gain insights into practices in the field.
We survey privacy-enhancing methods in use, which generally do not comply with state-of-the-art standards of differential privacy.
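For context, the standard those surveyed methods fall short of is differential privacy, whose textbook building block is the Laplace mechanism: noise calibrated to a query's sensitivity is added before release. A generic sketch (not a method from the surveyed study; the visit data is hypothetical):

```python
# Textbook Laplace mechanism: an epsilon-differentially-private count query.
# Generic sketch, not a method from the surveyed paper.
import random

def dp_count(records, predicate, epsilon):
    """Release a count with Laplace noise; a count query has sensitivity 1."""
    true_count = sum(1 for r in records if predicate(r))
    # Difference of two Exp(epsilon) variables is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

visits = [("u1", "clinic"), ("u2", "cafe"), ("u3", "clinic")]  # hypothetical
noisy = dp_count(visits, lambda v: v[1] == "clinic", epsilon=1.0)
```

The released value is the true count (here 2) plus noise whose scale grows as epsilon shrinks, which is what makes the guarantee rigorous rather than ad hoc.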
arXiv Detail & Related papers (2024-07-04T08:29:27Z)
- Measuring Privacy Loss in Distributed Spatio-Temporal Data
We propose an alternative privacy-loss measure against location reconstruction attacks by an informed adversary.
Our experiments on real and synthetic data demonstrate that our privacy loss better reflects our intuitions on individual privacy violation in the distributed setting.
arXiv Detail & Related papers (2024-02-18T09:53:14Z)
- $\alpha$-Mutual Information: A Tunable Privacy Measure for Privacy Protection in Data Sharing
This paper adopts Arimoto's $\alpha$-Mutual Information as a tunable privacy measure.
We formulate a general distortion-based mechanism that manipulates the original data to offer privacy protection.
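Under its standard formulation, Arimoto's measure is $I_\alpha(X;Y) = H_\alpha(X) - H_\alpha^A(X|Y)$, where $H_\alpha$ is the Rényi entropy and $H_\alpha^A$ is Arimoto's conditional entropy; tuning $\alpha$ changes how heavily the worst leakage events are weighted. A numerical sketch of that general definition (assumed from the standard literature, not code from the paper):

```python
# Sketch of Arimoto's alpha-mutual-information for a discrete joint pmf.
# Standard definitions from the literature; not code from the paper.
import math

def renyi_entropy(p, alpha):
    """Renyi entropy H_a(X) = log(sum p_i^a) / (1 - a), alpha != 1."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def arimoto_conditional(joint, alpha):
    """H_a^A(X|Y) = a/(1-a) * log sum_y ||p(., y)||_a for joint pmf p(x, y)."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    total = sum(
        sum(joint.get((x, y), 0.0) ** alpha for x in xs) ** (1 / alpha)
        for y in ys
    )
    return alpha / (1 - alpha) * math.log(total)

def alpha_mutual_information(joint, alpha):
    """I_a(X;Y) = H_a(X) - H_a^A(X|Y)."""
    xs = {x for x, _ in joint}
    px = [sum(v for (x, _), v in joint.items() if x == xi) for xi in xs]
    return renyi_entropy(px, alpha) - arimoto_conditional(joint, alpha)

# Hypothetical joint distribution over (sensitive attribute, released value).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
print(alpha_mutual_information(joint, alpha=2.0))  # → ~0.307 nats
```

As $\alpha \to 1$ the measure recovers Shannon mutual information, which is what makes it a "tunable" generalization.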
arXiv Detail & Related papers (2023-10-27T16:26:14Z)
- Where you go is who you are -- A study on machine learning based semantic privacy attacks
We present a systematic analysis of two attack scenarios, namely location categorization and user profiling.
Experiments on the Foursquare dataset and tracking data demonstrate the potential for abuse of high-quality spatial information.
Our findings point out the risks of ever-growing databases of tracking data and spatial context data.
arXiv Detail & Related papers (2023-10-26T17:56:50Z)
- Protecting User Privacy in Online Settings via Supervised Learning
We design an intelligent approach to online privacy protection that leverages supervised learning.
By detecting and blocking data collection that might infringe on a user's privacy, we can restore a degree of digital privacy to the user.
arXiv Detail & Related papers (2023-04-06T05:20:16Z)
- Lessons from the AdKDD'21 Privacy-Preserving ML Challenge
A prominent proposal at W3C only allows sharing advertising signals through aggregated, differentially private reports of past displays.
To study this proposal extensively, an open Privacy-Preserving Machine Learning Challenge took place at AdKDD'21.
A key finding is that learning models on large, aggregated data in the presence of a small set of unaggregated data points can be surprisingly efficient and cheap.
arXiv Detail & Related papers (2022-01-31T11:09:59Z)
- Utility-aware Privacy-preserving Data Releasing
We propose a two-step perturbation-based privacy-preserving data releasing framework.
First, certain predefined privacy and utility problems are learned from the public domain data.
We then leverage the learned knowledge to precisely perturb the data owners' data into privatized data.
arXiv Detail & Related papers (2020-05-09T05:32:46Z)
- PGLP: Customizable and Rigorous Location Privacy through Policy Graph
We propose a new location privacy notion called PGLP, which provides a rich interface to release private locations with customizable and rigorous privacy guarantees.
Specifically, we formalize a user's location privacy requirements using a location policy graph, which is expressive and customizable.
We also design a private location trace release framework that pipelines the detection of location exposure, policy graph repair, and private trajectory release with customizable and rigorous location privacy.
arXiv Detail & Related papers (2020-05-04T04:25:59Z)
- Beyond privacy regulations: an ethical approach to data usage in transportation
We describe how Federated Machine Learning can be applied to the transportation sector.
We see Federated Learning as a method that enables us to process privacy-sensitive data while respecting customers' privacy.
arXiv Detail & Related papers (2020-04-01T15:10:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.