Dis-Empowerment Online: An Investigation of Privacy-Sharing Perceptions & Method Preferences
- URL: http://arxiv.org/abs/2003.08990v1
- Date: Thu, 19 Mar 2020 19:17:55 GMT
- Title: Dis-Empowerment Online: An Investigation of Privacy-Sharing Perceptions & Method Preferences
- Authors: Kovila P.L. Coopamootoo
- Abstract summary: We find that perception of privacy empowerment differs from that of sharing across dimensions of meaningfulness, competence and choice.
We find similarities and differences in privacy method preference between the US, UK and Germany.
By mapping the perception of privacy dis-empowerment into patterns of privacy behavior online, this paper provides an important foundation for future research.
- Score: 6.09170287691728
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While it is often claimed that users are empowered via online technologies,
there is also a general feeling of privacy dis-empowerment. We investigate the
perception of privacy and sharing empowerment online, as well as the use of
privacy technologies, via a cross-national online study with N=907
participants. We find that perception of privacy empowerment differs from that
of sharing across dimensions of meaningfulness, competence and choice. We find
similarities and differences in privacy method preference between the US, UK
and Germany. We also find that non-technology methods of privacy protection are
among the most preferred methods, while more advanced and standalone privacy
technologies are least preferred. By mapping the perception of privacy
dis-empowerment into patterns of privacy behavior online, and clarifying the
similarities and distinctions in privacy technology use, this paper provides an
important foundation for future research and the design of privacy
technologies. The findings may be used across disciplines to develop more
user-centric privacy technologies that support and enable the user.
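As a purely illustrative companion to the cross-national comparison described above, the sketch below shows one conventional way preference counts could be compared across countries with a chi-square test of independence; the method categories and counts are hypothetical and are not taken from the paper.

```python
import numpy as np

# Hypothetical counts of participants preferring each privacy method,
# by country (rows: US, UK, Germany). Columns stand for made-up method
# categories, e.g. non-technology, browser settings, standalone tool.
# These numbers are illustrative only and do not come from the paper.
observed = np.array([
    [120, 150, 40],   # US
    [110, 140, 35],   # UK
    [130, 120, 62],   # Germany
], dtype=float)

# Chi-square test of independence: do preference distributions
# differ across countries?
row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
expected = row_totals @ col_totals / observed.sum()
chi2 = ((observed - expected) ** 2 / expected).sum()
dof = (observed.shape[0] - 1) * (observed.shape[1] - 1)
print(f"chi2 = {chi2:.2f}, dof = {dof}")
```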
Related papers
- Differential Privacy Overview and Fundamental Techniques [63.0409690498569]
This chapter is meant to be part of the book "Differential Privacy in Artificial Intelligence: From Theory to Practice".
It starts by illustrating various attempts to protect data privacy, emphasizing where and why they failed.
It then defines the key actors, tasks, and scopes that make up the domain of privacy-preserving data analysis.
arXiv Detail & Related papers (2024-11-07T13:52:11Z)
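For readers skimming this list, below is a minimal sketch of the classic Laplace mechanism that such overviews typically begin with; it is a generic textbook construction, not code from the chapter.

```python
import numpy as np

def laplace_count(data, predicate, epsilon, rng=None):
    """Release a counting query under epsilon-differential privacy.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon gives the standard epsilon-DP guarantee.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for record in data if predicate(record))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Toy example: privately count records with age >= 30 at epsilon = 0.5.
records = [{"age": 25}, {"age": 41}, {"age": 33}, {"age": 58}]
print(laplace_count(records, lambda r: r["age"] >= 30, epsilon=0.5))
```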
- Masked Differential Privacy [64.32494202656801]
We propose an effective approach called masked differential privacy (DP), which allows for controlling sensitive regions where differential privacy is applied.
Our method operates selectively on data and allows for defining non-sensitive spatio-temporal regions without DP application, or for combining differential privacy with other privacy techniques within data samples.
arXiv Detail & Related papers (2024-10-22T15:22:53Z)
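As a rough illustration of the selective idea summarised above (an assumption-laden sketch, not the authors' implementation), noise can be added only inside a sensitive-region mask while the rest of the sample is left untouched:

```python
import numpy as np

def selectively_noised(sample, sensitive_mask, epsilon, sensitivity=1.0, rng=None):
    """Add Laplace noise only where sensitive_mask is True.

    Hypothetical masking policy and calibration for illustration; this is
    not the cited paper's method.
    """
    rng = rng or np.random.default_rng()
    values = np.asarray(sample, dtype=float)
    noise = rng.laplace(0.0, sensitivity / epsilon, size=values.shape)
    return np.where(sensitive_mask, values + noise, values)

frame = np.arange(12.0).reshape(3, 4)    # toy data sample, e.g. a tiny "image"
mask = np.zeros_like(frame, dtype=bool)  # mark one region as sensitive
mask[1, 1:3] = True
print(selectively_noised(frame, mask, epsilon=1.0))
```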
- Privacy Checklist: Privacy Violation Detection Grounding on Contextual Integrity Theory [43.12744258781724]
We formulate the privacy issue as a reasoning problem rather than simple pattern matching.
We develop the first comprehensive checklist that covers social identities, private attributes, and existing privacy regulations.
arXiv Detail & Related papers (2024-08-19T14:48:04Z)
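To make the contextual-integrity framing concrete, here is a toy sketch of checking an information flow against context norms; the context, actors, and norms are hypothetical, and the paper's checklist is far richer than this.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Flow:
    """An information flow described in contextual-integrity terms."""
    sender: str
    recipient: str
    attribute: str
    transmission_principle: str

# Hypothetical norms for a health context: which (sender, recipient,
# attribute) flows are acceptable, and under which transmission principle.
HEALTH_CONTEXT_NORMS = {
    ("patient", "doctor", "medical history"): "with consent",
    ("doctor", "insurer", "billing code"): "as required by contract",
}

def violates_contextual_integrity(flow: Flow) -> bool:
    """Flag flows whose actors/attribute are not covered by the context's
    norms, or whose transmission principle does not match the norm."""
    expected = HEALTH_CONTEXT_NORMS.get(
        (flow.sender, flow.recipient, flow.attribute))
    return expected is None or expected != flow.transmission_principle

print(violates_contextual_integrity(
    Flow("doctor", "advertiser", "medical history", "sold")))  # True
```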
- Models Matter: Setting Accurate Privacy Expectations for Local and Central Differential Privacy [14.40391109414476]
We design and evaluate new explanations of differential privacy for the local and central models.
We find that consequences-focused explanations in the style of privacy nutrition labels are a promising approach for setting accurate privacy expectations.
arXiv Detail & Related papers (2024-08-16T01:21:57Z)
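The local/central distinction these explanations target can be seen in a short, generic contrast (not taken from the paper): randomized response applied on each user's device versus Laplace noise added once to an exact count by a trusted curator.

```python
import numpy as np

rng = np.random.default_rng(0)
true_answers = rng.integers(0, 2, size=1000)   # 1000 users, a yes/no attribute
epsilon = 1.0

# Local model: each user randomizes their own answer (randomized response)
# before it leaves their device; no trusted curator is needed.
p_truth = np.exp(epsilon) / (np.exp(epsilon) + 1)
keep = rng.random(true_answers.size) < p_truth
local_reports = np.where(keep, true_answers, 1 - true_answers)
# Unbiased estimate of the true count recovered from the noisy reports.
local_estimate = (local_reports.sum() - (1 - p_truth) * len(local_reports)) / (2 * p_truth - 1)

# Central model: a trusted curator sees the raw data and adds Laplace
# noise once to the exact count (sensitivity 1 for a counting query).
central_estimate = true_answers.sum() + rng.laplace(0.0, 1.0 / epsilon)

print(f"true count: {true_answers.sum()}")
print(f"local-model estimate:   {local_estimate:.1f}")
print(f"central-model estimate: {central_estimate:.1f}")
```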
- Collection, usage and privacy of mobility data in the enterprise and public administrations [55.2480439325792]
Security measures such as anonymization are needed to protect individuals' privacy.
Within our study, we conducted expert interviews to gain insights into practices in the field.
We survey privacy-enhancing methods in use, which generally do not comply with state-of-the-art standards of differential privacy.
arXiv Detail & Related papers (2024-07-04T08:29:27Z)
- Algorithms with More Granular Differential Privacy Guarantees [65.3684804101664]
We consider partial differential privacy (DP), which allows quantifying the privacy guarantee on a per-attribute basis.
In this work, we study several basic data analysis and learning tasks, and design algorithms whose per-attribute privacy parameter is smaller than the best possible privacy parameter for the entire record of a person.
arXiv Detail & Related papers (2022-09-08T22:43:50Z)
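A loose sketch of the per-attribute idea, under assumed attributes, ranges, and budgets that are not the paper's: each attribute's release is noised with its own privacy parameter rather than a single record-level one.

```python
import numpy as np

def per_attribute_noisy_sums(records, epsilons, ranges, rng=None):
    """Release per-attribute sums with a separate privacy parameter per attribute.

    Illustrative only: attribute j gets Laplace noise scaled to its own
    range / epsilon_j, so the guarantee with respect to a change in
    attribute j alone is governed by epsilon_j. Attributes and budgets
    here are hypothetical.
    """
    rng = rng or np.random.default_rng()
    data = np.asarray(records, dtype=float)
    scales = np.asarray(ranges, dtype=float) / np.asarray(epsilons, dtype=float)
    return data.sum(axis=0) + rng.laplace(0.0, scales)

# Columns: age (clipped to [0, 100]), weekly hours online (clipped to [0, 80]).
records = [[34, 20], [51, 12], [29, 35], [44, 8]]
print(per_attribute_noisy_sums(records, epsilons=[0.1, 1.0], ranges=[100, 80]))
```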
- The Evolving Path of "the Right to Be Left Alone" - When Privacy Meets Technology [0.0]
This paper proposes a novel vision of the privacy ecosystem, introducing privacy dimensions, the related users' expectations, the privacy violations, and the changing factors.
We believe that promising approaches to tackle the privacy challenges move in two directions: (i) identification of effective privacy metrics; and (ii) adoption of formal tools to design privacy-compliant applications.
arXiv Detail & Related papers (2021-11-24T11:27:55Z)
- Equity and Privacy: More Than Just a Tradeoff [10.545898004301323]
Recent work has shown that privacy preserving data publishing can introduce different levels of utility across different population groups.
Will marginal populations see disproportionately less utility from privacy technology?
If there is an inequity how can we address it?
arXiv Detail & Related papers (2021-11-08T17:39:32Z)
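The kind of inequity the authors ask about is easy to see in a toy calculation with made-up numbers: the same noise scale translates into a much larger relative error for a small group than for a large one.

```python
import numpy as np

rng = np.random.default_rng(1)
group_counts = {"majority group": 50_000, "marginalised group": 500}
epsilon = 0.5

for name, true_count in group_counts.items():
    # Same Laplace noise scale for both groups (sensitivity-1 counting query),
    # so the *relative* error is far larger for the smaller group.
    noisy = true_count + rng.laplace(0.0, 1.0 / epsilon)
    rel_error = abs(noisy - true_count) / true_count
    print(f"{name}: true={true_count}, noisy={noisy:.0f}, relative error={rel_error:.2%}")
```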
- User Perception of Privacy with Ubiquitous Devices [5.33024001730262]
This study aims to explore and discover various concerns related to the perception of privacy in this era of ubiquitous technologies.
Key themes include attitudes towards privacy in public and private spaces, privacy awareness, consent seeking, dilemmas and confusion related to various technologies, and the impact of attitudes and beliefs on individuals' actions to protect themselves from invasion of privacy in both public and private spaces.
arXiv Detail & Related papers (2021-07-23T05:01:44Z)
- Private Reinforcement Learning with PAC and Regret Guarantees [69.4202374491817]
We design privacy-preserving exploration policies for episodic reinforcement learning (RL).
We first provide a meaningful privacy formulation using the notion of joint differential privacy (JDP).
We then develop a private optimism-based learning algorithm that simultaneously achieves strong PAC and regret bounds, and enjoys a JDP guarantee.
arXiv Detail & Related papers (2020-09-18T20:18:35Z)
- Beyond privacy regulations: an ethical approach to data usage in transportation [64.86110095869176]
We describe how Federated Machine Learning can be applied to the transportation sector.
We see Federated Learning as a method that enables us to process privacy-sensitive data while respecting customers' privacy.
arXiv Detail & Related papers (2020-04-01T15:10:12Z)
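For context on the federated approach mentioned above, a minimal FedAvg-style round is sketched below with made-up clients and a toy linear model; it is a generic illustration, not the system described in the paper. Each client trains on data that never leaves its premises, and only model parameters are aggregated centrally.

```python
import numpy as np

def local_update(weights, features, labels, lr=0.1, steps=10):
    """One client's local training: a few gradient steps of linear
    regression on data that never leaves the client."""
    w = weights.copy()
    for _ in range(steps):
        grad = features.T @ (features @ w - labels) / len(labels)
        w -= lr * grad
    return w

def federated_averaging_round(global_weights, client_datasets):
    """Average the clients' locally trained weights, weighted by their
    dataset sizes (the basic FedAvg aggregation rule)."""
    sizes = np.array([len(y) for _, y in client_datasets], dtype=float)
    local_weights = [local_update(global_weights, X, y) for X, y in client_datasets]
    return np.average(local_weights, axis=0, weights=sizes)

# Toy setup: two "transport operators", each holding private trip data.
rng = np.random.default_rng(2)
clients = [(rng.normal(size=(40, 3)), rng.normal(size=40)) for _ in range(2)]
weights = np.zeros(3)
for _ in range(5):                      # five communication rounds
    weights = federated_averaging_round(weights, clients)
print(weights)
```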
This list is automatically generated from the titles and abstracts of the papers in this site.