Usable, Acceptable, Appropriable: Towards Practicable Privacy
- URL: http://arxiv.org/abs/2004.07359v1
- Date: Wed, 15 Apr 2020 21:39:33 GMT
- Title: Usable, Acceptable, Appropriable: Towards Practicable Privacy
- Authors: Aakash Gautam
- Abstract summary: This paper explores the privacy needs of marginalized and vulnerable populations.
We introduce computers and the Internet to a group of sex-trafficking survivors in Nepal.
We highlight a few socio-political factors that have influenced the design space around digital privacy.
- Score: 2.0305676256390934
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A majority of the work on digital privacy and security has focused
on users from developed countries, who account for only around 20% of the
global population. Moreover, the privacy needs of populations that are already
marginalized and vulnerable differ from those of users who have the privilege
of access to a greater social support system. We reflect on our experiences of
introducing
computers and the Internet to a group of sex-trafficking survivors in Nepal and
highlight a few socio-political factors that have influenced the design space
around digital privacy. These factors include the population's limited digital
and text literacy skills and the fear of stigma against trafficked persons
widely prevalent in Nepali society. We underscore the need to widen our
perspective by focusing on practicable privacy, that is, privacy practices that
are (1) usable, (2) acceptable, and (3) appropriable.
Related papers
- Evaluating the Effects of Digital Privacy Regulations on User Trust [0.0]
The study investigates the impact of digital privacy laws on user trust by comparing regulations in the Netherlands, Ghana, and Malaysia.
The main findings reveal that while the General Data Protection Regulation (GDPR) in the Netherlands is strict, its practical impact is limited by enforcement challenges.
In Ghana, the Data Protection Act is underutilized due to low public awareness and insufficient enforcement, leading to a reliance on personal protective measures.
In Malaysia, trust in digital services is largely dependent on the security practices of individual platforms rather than the Personal Data Protection Act.
arXiv Detail & Related papers (2024-09-04T11:11:41Z) - Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning [62.224804688233]
Differential privacy (DP) offers a promising solution by ensuring models are 'almost indistinguishable' with or without any particular privacy unit's data.
We study user-level DP, motivated by applications where it is necessary to ensure uniform privacy protection across users.
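For readers unfamiliar with the mechanics, here is a minimal sketch of how user-level DP aggregation is commonly realized: each user's entire contribution is clipped and Gaussian noise is added, in the spirit of DP-SGD/DP-FedAvg. This is not the cited paper's method, and `clip_norm` and `noise_multiplier` are illustrative assumptions.

```python
# Minimal sketch of user-level DP aggregation (per-user clipping + Gaussian
# noise). Illustrative only; not the cited paper's method, and the parameter
# values are assumptions.
import numpy as np

def user_level_dp_mean(per_user_updates, clip_norm=1.0, noise_multiplier=1.0, seed=0):
    """Aggregate one update vector per user under user-level DP.

    Clipping each user's whole contribution bounds the sensitivity of the sum
    to adding or removing any single user; Gaussian noise is then calibrated
    to that bound.
    """
    rng = np.random.default_rng(seed)
    clipped = [u * min(1.0, clip_norm / (np.linalg.norm(u) + 1e-12))
               for u in per_user_updates]
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(per_user_updates)

# Toy usage: three users, one 4-dimensional update each.
updates = [np.array([0.5, -1.0, 0.2, 0.0]),
           np.array([2.0, 2.0, 2.0, 2.0]),   # large contribution gets clipped
           np.array([-0.1, 0.4, 0.0, 0.3])]
print(user_level_dp_mean(updates))
```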
arXiv Detail & Related papers (2024-06-20T13:54:32Z) - The Illusion of Anonymity: Uncovering the Impact of User Actions on Privacy in Web3 Social Ecosystems [11.501563549824466]
We investigate the nuanced dynamics between user engagement on Web3 social platforms and the consequent privacy concerns.
We scrutinize the widespread phenomenon of fabricated activities, which encompasses the establishment of bogus accounts aimed at mimicking popularity.
We highlight the urgent need for more stringent privacy measures and ethical protocols to navigate the complex web of social exchanges.
arXiv Detail & Related papers (2024-05-22T06:26:15Z) - A Comprehensive Picture of Factors Affecting User Willingness to Use
Mobile Health Applications [62.60524178293434]
The aim of this paper is to investigate the factors that influence user acceptance of mHealth apps.
Users' digital literacy has the strongest impact on their willingness to use them, followed by their online habit of sharing personal information.
Users' demographic background, such as their country of residence, age, ethnicity, and education, has a significant moderating effect.
arXiv Detail & Related papers (2023-05-10T08:11:21Z) - Protecting User Privacy in Online Settings via Supervised Learning [69.38374877559423]
We design an intelligent approach to online privacy protection that leverages supervised learning.
By detecting and blocking data collection that might infringe on a user's privacy, we can restore a degree of digital privacy to the user.
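As a toy illustration of this idea, the sketch below trains a classifier on hand-crafted request features and uses it to decide whether to block a request. The features, labels, and model choice are assumptions made for illustration and are not taken from the cited paper.

```python
# Toy sketch: supervised classification of network requests as "block" or
# "allow". Features, labels, and model are illustrative assumptions only.
from sklearn.linear_model import LogisticRegression

# Hypothetical per-request features:
# [is_third_party, has_tracking_keyword, num_query_params]
X_train = [
    [1, 1, 12],   # third-party request with tracker-like keyword -> block
    [1, 0, 8],
    [0, 0, 1],    # plain first-party request -> allow
    [0, 0, 3],
    [1, 1, 20],
    [0, 1, 2],
]
y_train = [1, 1, 0, 0, 1, 0]   # 1 = block (likely data collection), 0 = allow

clf = LogisticRegression().fit(X_train, y_train)

# Decide on a new, unseen request.
new_request = [[1, 1, 15]]
print("block" if clf.predict(new_request)[0] == 1 else "allow")
```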
arXiv Detail & Related papers (2023-04-06T05:20:16Z) - Position: Considerations for Differentially Private Learning with Large-Scale Public Pretraining [75.25943383604266]
We question whether the use of large Web-scraped datasets should be viewed as differential-privacy-preserving.
We caution that publicizing these models pretrained on Web data as "private" could lead to harm and erode the public's trust in differential privacy as a meaningful definition of privacy.
We conclude by discussing potential paths forward for the field of private learning, as public pretraining becomes more popular and powerful.
arXiv Detail & Related papers (2022-12-13T10:41:12Z) - Privacy Explanations - A Means to End-User Trust [64.7066037969487]
We looked into how explainability might help to address end users' lack of trust in how software systems handle their data.
We created privacy explanations that aim to clarify to end users why and for what purposes specific data is required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z) - Privacy-Preserving Face Recognition with Learnable Privacy Budgets in
Frequency Domain [77.8858706250075]
This paper proposes a privacy-preserving face recognition method using differential privacy in the frequency domain.
Our method performs very well with several classical face recognition test sets.
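As a rough illustration of frequency-domain perturbation only (the cited paper learns per-frequency privacy budgets, which is not reproduced here), the sketch below adds Laplace noise to an image's DCT coefficients under an assumed fixed epsilon and unit sensitivity.

```python
# Rough sketch: perturb an image's frequency-domain (DCT) representation with
# Laplace noise. Illustrative only; epsilon and sensitivity are assumptions.
import numpy as np
from scipy.fft import dctn, idctn

def noisy_frequency_image(image, epsilon=1.0, sensitivity=1.0, seed=0):
    rng = np.random.default_rng(seed)
    coeffs = dctn(image, norm="ortho")                  # to frequency domain
    coeffs += rng.laplace(0.0, sensitivity / epsilon, size=coeffs.shape)
    return idctn(coeffs, norm="ortho")                  # back to pixel domain

# Toy usage on a random 8x8 "face crop".
img = np.random.default_rng(1).random((8, 8))
print(np.abs(noisy_frequency_image(img) - img).mean())
```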
arXiv Detail & Related papers (2022-07-15T07:15:36Z) - Digital Divide and Social Dilemma of Privacy Preservation [0.6261444979025642]
"Digital privacy divide (DPD)" is introduced to describe the perceived gap in the privacy preservation of individuals based on the geopolitical location of different countries.
We created an online questionnaire and collected answers from more than 700 respondents from four different countries.
Individuals residing in Germany and Bangladesh share similar privacy concerns, while there is a significant similarity between individuals residing in the United States and India.
arXiv Detail & Related papers (2021-10-06T11:43:46Z) - The Challenges and Impact of Privacy Policy Comprehension [0.0]
This paper experimentally manipulated the privacy-friendliness of an unavoidable and simple privacy policy.
Half of our participants miscomprehended even this transparent privacy policy.
To mitigate such pitfalls we present design recommendations to improve the quality of informed consent.
arXiv Detail & Related papers (2020-05-18T14:16:48Z) - Dis-Empowerment Online: An Investigation of Privacy-Sharing Perceptions
& Method Preferences [6.09170287691728]
We find that perception of privacy empowerment differs from that of sharing across dimensions of meaningfulness, competence and choice.
We find similarities and differences in privacy method preference between the US, UK and Germany.
By mapping the perception of privacy dis-empowerment into patterns of privacy behavior online, this paper provides an important foundation for future research.
arXiv Detail & Related papers (2020-03-19T19:17:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.