Efficient User-Centric Privacy-Friendly and Flexible Wearable Data Aggregation and Sharing
- URL: http://arxiv.org/abs/2203.00465v3
- Date: Sun, 3 Mar 2024 23:31:56 GMT
- Title: Efficient User-Centric Privacy-Friendly and Flexible Wearable Data Aggregation and Sharing
- Authors: Khlood Jastaniah, Ning Zhang, Mustafa A. Mustafa
- Abstract summary: Wearable devices can offer services to individuals and the public.
Wearable data collected by cloud providers may pose privacy risks.
We propose a novel, efficient, user-centric, privacy-friendly, and flexible data aggregation and sharing scheme, named SAMA.
- Score: 9.532148238768213
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Wearable devices can offer services to individuals and the public. However, wearable data collected by cloud providers may pose privacy risks. To reduce these risks while maintaining full functionality, healthcare systems require solutions for privacy-friendly data processing and sharing that can accommodate three main use cases: (i) data owners requesting processing of their own data, and multiple data requesters requesting data processing of (ii) a single or (iii) multiple data owners. Existing work lacks data owner access control and does not efficiently support these cases, making them unsuitable for wearable devices. To address these limitations, we propose a novel, efficient, user-centric, privacy-friendly, and flexible data aggregation and sharing scheme, named SAMA. SAMA uses a multi-key partial homomorphic encryption scheme to allow flexibility in accommodating the aggregation of data originating from a single or multiple data owners while preserving privacy during the processing. It also uses ciphertext-policy attribute-based encryption scheme to support fine-grain sharing with multiple data requesters based on user-centric access control. Formal security analysis shows that SAMA supports data confidentiality and authorisation. SAMA has also been analysed in terms of computational and communication overheads. Our experimental results demonstrate that SAMA supports privacy-preserving flexible data aggregation more efficiently than the relevant state-of-the-art solutions.
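The abstract does not spell out SAMA's construction, and its actual scheme is multi-key. As a hedged, single-key illustration of the additively homomorphic aggregation such schemes build on, the sketch below implements a toy Paillier cryptosystem: ciphertexts from different data owners can be multiplied so that the result decrypts to the sum of the plaintexts. The primes, function names, and parameters are our own illustrative choices, not from the paper, and the key sizes are far too small for real use.

```python
import math
import secrets

def keygen(p=999_983, q=1_000_003):
    """Toy Paillier key generation. Real deployments use ~2048-bit primes."""
    n = p * q
    n2 = n * n
    lam = math.lcm(p - 1, q - 1)
    # With g = n + 1, L(g^lam mod n^2) = lam mod n, so mu is just lam^-1 mod n.
    mu = pow(lam, -1, n)
    return (n, n2), (lam, mu)

def encrypt(pk, m):
    """Encrypt m < n as c = (n+1)^m * r^n mod n^2 with random r."""
    n, n2 = pk
    r = secrets.randbelow(n - 1) + 1  # gcd(r, n) != 1 only with negligible probability
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(pk, sk, c):
    """Recover m = L(c^lam mod n^2) * mu mod n, where L(x) = (x-1) // n."""
    n, n2 = pk
    lam, mu = sk
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

def aggregate(pk, ciphertexts):
    """Homomorphic addition: the product of ciphertexts mod n^2
    decrypts to the sum of the underlying plaintexts."""
    _, n2 = pk
    acc = 1
    for c in ciphertexts:
        acc = (acc * c) % n2
    return acc
```

For example, three data owners could each encrypt a daily step-count-style reading; a cloud aggregator multiplies the ciphertexts without learning any individual value, and only the key holder decrypts the total. Note this is a single-key sketch; SAMA's multi-key variant and its CP-ABE-based access control are not captured here.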
Related papers
- FT-PrivacyScore: Personalized Privacy Scoring Service for Machine Learning Participation [4.772368796656325]
In practice, controlled data access remains a mainstream method for protecting data privacy in many industrial and research environments.
We developed the demo prototype FT-PrivacyScore to show that it is possible to efficiently and quantitatively estimate the privacy risk of participating in a model fine-tuning task.
arXiv Detail & Related papers (2024-10-30T02:41:26Z) - Masked Differential Privacy [64.32494202656801]
We propose an effective approach called masked differential privacy (DP), which allows for controlling sensitive regions where differential privacy is applied.
Our method operates selectively on data and allows for defining non-sensitive-temporal regions without DP application or combining differential privacy with other privacy techniques within data samples.
arXiv Detail & Related papers (2024-10-22T15:22:53Z) - Towards Personal Data Sharing Autonomy: A Task-driven Data Capsule Sharing System [5.076862984714449]
We introduce a novel task-driven personal data sharing system based on the data capsule paradigm realizing personal data sharing autonomy.
Specifically, we present a tamper-resistant data capsule encapsulation method, where the data capsule is the minimal unit for independent and secure personal data storage and sharing.
arXiv Detail & Related papers (2024-09-27T05:13:33Z) - Privacy-Preserving Data Management using Blockchains [0.0]
Data providers need to control and update existing privacy preferences due to changing data usage.
This paper proposes a blockchain-based methodology for protecting data providers' private and sensitive data.
arXiv Detail & Related papers (2024-08-21T01:10:39Z) - Robust Utility-Preserving Text Anonymization Based on Large Language Models [80.5266278002083]
Text anonymization is crucial for sharing sensitive data while maintaining privacy.
Existing techniques face the emerging challenge of re-identification attacks enabled by Large Language Models.
This paper proposes a framework composed of three LLM-based components -- a privacy evaluator, a utility evaluator, and an optimization component.
arXiv Detail & Related papers (2024-07-16T14:28:56Z) - Mind the Privacy Unit! User-Level Differential Privacy for Language Model Fine-Tuning [62.224804688233]
Differential privacy (DP) offers a promising solution by ensuring models are 'almost indistinguishable' with or without any particular privacy unit.
We study user-level DP, motivated by applications where it is necessary to ensure uniform privacy protection across users.
arXiv Detail & Related papers (2024-06-20T13:54:32Z) - S3PHER: Secure and Searchable System for Patient-driven HEalth data shaRing [0.0]
Current systems for sharing health data between patients and caregivers do not fully address the critical security requirements of privacy, confidentiality, and consent management.
We present S3PHER, a novel approach to sharing health data that provides patients with control over who accesses their data, what data is accessed, and when.
arXiv Detail & Related papers (2024-04-17T13:31:50Z) - FewFedPIT: Towards Privacy-preserving and Few-shot Federated Instruction Tuning [54.26614091429253]
Federated instruction tuning (FedIT) is a promising solution that consolidates collaborative training across multiple data owners.
FedIT encounters limitations such as the scarcity of instruction data and the risk of training data extraction attacks.
We propose FewFedPIT, designed to simultaneously enhance privacy protection and model performance of federated few-shot learning.
arXiv Detail & Related papers (2024-03-10T08:41:22Z) - How Do Input Attributes Impact the Privacy Loss in Differential Privacy? [55.492422758737575]
We study the connection between the per-subject norm in DP neural networks and individual privacy loss.
We introduce a novel metric termed the Privacy Loss-Input Susceptibility (PLIS) which allows one to apportion the subject's privacy loss to their input attributes.
arXiv Detail & Related papers (2022-11-18T11:39:03Z) - Reasoning over Public and Private Data in Retrieval-Based Systems [29.515915401413334]
State-of-the-art systems explicitly retrieve information relevant to a user question from a background corpus before producing an answer.
While today's retrieval systems assume the corpus is fully accessible, users are often unable or unwilling to expose their private data to entities hosting public data.
We first define the Public-Private Autoregressive Information Retrieval (PAIR) privacy framework for the novel retrieval setting over multiple privacy scopes.
arXiv Detail & Related papers (2022-03-14T13:08:51Z) - BeeTrace: A Unified Platform for Secure Contact Tracing that Breaks Data Silos [73.84437456144994]
Contact tracing is an important method to control the spread of an infectious disease such as COVID-19.
Current solutions do not utilize the huge volume of data stored in business databases and individual digital devices.
We propose BeeTrace, a unified platform that breaks data silos and deploys state-of-the-art cryptographic protocols to guarantee privacy goals.
arXiv Detail & Related papers (2020-07-05T10:33:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.