Associating eHealth Policies and National Data Privacy Regulations
- URL: http://arxiv.org/abs/2203.04089v1
- Date: Sun, 27 Feb 2022 21:22:48 GMT
- Title: Associating eHealth Policies and National Data Privacy Regulations
- Authors: Saurav K. Aryal, Peter A. Keiller
- Abstract summary: This project aims to evaluate and highlight associations between systems' policies and privacy regulations.
Using bias-corrected Cramér's V and Theil's U tests, we found weak and zero associations between e-health systems' rules and protections for data privacy.
- Score: 1.713291434132985
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As electronic data becomes the lifeline of modern society, privacy concerns
increase. These concerns are reflected by the European Union's enactment of the
General Data Protection Regulation (GDPR), one of the most comprehensive and
robust privacy regulations globally. This project aims to evaluate and
highlight associations between eHealth systems' policies and personal data
privacy regulations. Using bias-corrected Cramér's V and Theil's U tests, we
found weak and zero associations between e-health systems' rules and
protections for data privacy. A simple decision tree model is trained, which
validates the association scores obtained.
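The abstract does not include the authors' code; as an illustrative sketch, the two association measures it names — bias-corrected Cramér's V (Bergsma's correction) and Theil's U (the uncertainty coefficient) — can be computed for a pair of categorical variables as follows. The function names and toy inputs below are ours, not the paper's:

```python
import math
from collections import Counter

def cramers_v_corrected(x, y):
    """Bias-corrected Cramér's V (Bergsma, 2013) between two categorical lists."""
    n = len(x)
    xs, ys = sorted(set(x)), sorted(set(y))
    r, k = len(xs), len(ys)
    obs = Counter(zip(x, y))          # observed cell counts
    row, col = Counter(x), Counter(y)
    chi2 = 0.0
    for a in xs:
        for b in ys:
            exp = row[a] * col[b] / n  # expected count under independence
            chi2 += (obs[(a, b)] - exp) ** 2 / exp
    phi2 = chi2 / n
    # Bergsma's bias correction of phi^2 and of the table dimensions
    phi2c = max(0.0, phi2 - (k - 1) * (r - 1) / (n - 1))
    rc = r - (r - 1) ** 2 / (n - 1)
    kc = k - (k - 1) ** 2 / (n - 1)
    denom = min(kc - 1, rc - 1)
    return math.sqrt(phi2c / denom) if denom > 0 else 0.0

def theils_u(x, y):
    """Theil's U, i.e. U(x|y): the fraction of H(x) explained by knowing y."""
    n = len(x)
    def entropy(counts, total):
        return -sum(c / total * math.log(c / total) for c in counts.values() if c)
    h_x = entropy(Counter(x), n)
    if h_x == 0:
        return 1.0                    # x is constant: nothing left to explain
    h_x_given_y = 0.0                 # conditional entropy H(x|y)
    ycounts = Counter(y)
    for (a, b), c in Counter(zip(x, y)).items():
        h_x_given_y -= (c / n) * math.log(c / ycounts[b])
    return (h_x - h_x_given_y) / h_x

# Toy usage: a perfectly associated pair yields 1.0 for both measures.
x = ["a", "a", "b", "b"]
y = ["p", "p", "q", "q"]
v = cramers_v_corrected(x, y)
u = theils_u(x, y)
```

Note that Theil's U is asymmetric (U(x|y) need not equal U(y|x)), which is why studies like this one often report it alongside the symmetric Cramér's V.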
Related papers
- A Qualitative Analysis Framework for mHealth Privacy Practices [0.0]
This paper introduces a novel framework for the qualitative evaluation of privacy practices in mHealth apps.
Our investigation encompasses an analysis of 152 leading mHealth apps on the Android platform.
Our findings indicate persistent issues with negligence and misuse of sensitive user information.
arXiv Detail & Related papers (2024-05-28T08:57:52Z) - Secure Aggregation is Not Private Against Membership Inference Attacks [66.59892736942953]
We investigate the privacy implications of SecAgg in federated learning.
We show that SecAgg offers weak privacy against membership inference attacks even in a single training round.
Our findings underscore the imperative for additional privacy-enhancing mechanisms, such as noise injection.
arXiv Detail & Related papers (2024-03-26T15:07:58Z) - SoK: The Gap Between Data Rights Ideals and Reality [46.14715472341707]
Do rights-based privacy laws effectively empower individuals over their data?
This paper scrutinizes these approaches by reviewing empirical studies, news articles, and blog posts.
arXiv Detail & Related papers (2023-12-03T21:52:51Z) - A Critical Take on Privacy in a Datafied Society [0.0]
I analyze several facets of the lack of online privacy and idiosyncrasies exhibited by privacy advocates.
I discuss possible effects of datafication on human behavior, the prevalent market-oriented assumptions underlying online privacy, and some emerging adaptation strategies.
A glimpse of a likely problematic future is provided through a discussion of privacy-related aspects of the EU's, UK's, and China's proposed generative AI policies.
arXiv Detail & Related papers (2023-08-03T11:45:18Z) - Towards Blockchain-Assisted Privacy-Aware Data Sharing For Edge Intelligence: A Smart Healthcare Perspective [19.208368632576153]
Linkage attacks are a dominant attack type in the privacy domain.
Adversaries launch poisoning attacks to falsify health data, which can lead to misdiagnosis or even physical harm.
To protect private health data, we propose a personalized differential privacy model based on the trust levels among users.
arXiv Detail & Related papers (2023-06-29T02:06:04Z) - How Do Input Attributes Impact the Privacy Loss in Differential Privacy? [55.492422758737575]
We study the connection between the per-subject norm in DP neural networks and individual privacy loss.
We introduce a novel metric termed the Privacy Loss-Input Susceptibility (PLIS), which allows one to apportion the subject's privacy loss to their input attributes.
arXiv Detail & Related papers (2022-11-18T11:39:03Z) - Distributed Machine Learning and the Semblance of Trust [66.1227776348216]
Federated Learning (FL) allows the data owner to maintain data governance and perform model training locally without having to share their data.
FL and related techniques are often described as privacy-preserving.
We explain why this term is not appropriate and outline the risks associated with over-reliance on protocols that were not designed with formal definitions of privacy in mind.
arXiv Detail & Related papers (2021-12-21T08:44:05Z) - Privacy Information Classification: A Hybrid Approach [9.642559585173517]
This study proposes and develops a hybrid privacy classification approach to detect and classify privacy information from OSNs.
The proposed hybrid approach employs both deep learning models and ontology-based models for privacy-related information extraction.
arXiv Detail & Related papers (2021-01-27T18:03:18Z) - Second layer data governance for permissioned blockchains: the privacy management challenge [58.720142291102135]
In pandemic situations, such as the COVID-19 and Ebola outbreaks, sharing health data is crucial to curbing mass infection and reducing the number of deaths.
In this sense, permissioned blockchain technology emerges to empower users to exercise their rights, providing data ownership, transparency, and security through an immutable, unified, and distributed database ruled by smart contracts.
arXiv Detail & Related papers (2020-10-22T13:19:38Z) - Beyond privacy regulations: an ethical approach to data usage in transportation [64.86110095869176]
We describe how Federated Machine Learning can be applied to the transportation sector.
We see Federated Learning as a method that enables us to process privacy-sensitive data while respecting customers' privacy.
arXiv Detail & Related papers (2020-04-01T15:10:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.