Privacy of Fitness Applications and Consent Management in Blockchain
- URL: http://arxiv.org/abs/2203.00791v1
- Date: Tue, 1 Mar 2022 23:31:47 GMT
- Title: Privacy of Fitness Applications and Consent Management in Blockchain
- Authors: May Alhajri, Ahmad Salehi Shahraki and Carsten Rudolph
- Abstract summary: This paper describes the importance of adopting and applying legal frameworks within the fitness tracker ecosystem.
We identify four main problems related to preserving the privacy of users of fitness apps.
We conclude by describing how blockchain is suitable for solving these privacy issues.
- Score: 0.966840768820136
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The rapid advances in fitness wearable devices are redefining privacy around
interactions. Fitness wearable devices record a considerable amount of
sensitive and private details about exercise, blood oxygen level, and heart
rate. Privacy concerns have emerged about the interactions between an
individual's raw fitness data and data analysis by the providers of fitness
apps and wearable devices. This paper describes the importance of adopting and
applying legal frameworks within the fitness tracker ecosystem. In this review,
we describe the studies on the current privacy policies of fitness app
providers, heuristically evaluate the methods for consent management by fitness
providers, summarize the gaps identified in our review of these studies, and
discuss potential solutions for filling the gaps identified. We have identified
four main problems related to preserving the privacy of users of fitness apps:
lack of system transparency, lack of privacy policy legibility, concerns
regarding one-time consent, and issues of noncompliance regarding consent
management. After discussing feasible solutions, we conclude by describing how
blockchain is suitable for solving these privacy issues.
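As a rough illustration of why a ledger-based design speaks to the one-time-consent and transparency problems named above, the following Python sketch models consent as an append-only, hash-chained log in which grants and revocations are tamper-evident and the most recent record wins. This is a minimal sketch under assumed names (ConsentRecord, ConsentLedger); it is not the system proposed in the paper, and a real deployment would rely on an actual blockchain and smart contracts rather than an in-memory list.

```python
# Illustrative sketch only: dynamic, revocable consent as an append-only,
# hash-chained log. Not the authors' design; names and fields are assumptions.
import hashlib
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ConsentRecord:
    user_id: str       # data subject granting or revoking consent
    provider_id: str   # fitness-app provider requesting access
    purpose: str       # stated purpose, e.g. "heart-rate analytics"
    granted: bool      # True = grant, False = revocation
    timestamp: float
    prev_hash: str     # hash of the previous record (chain link)

    def digest(self) -> str:
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

class ConsentLedger:
    """Append-only log; the hash chain makes silent edits detectable."""

    def __init__(self) -> None:
        self.records: list[ConsentRecord] = []

    def append(self, user_id: str, provider_id: str,
               purpose: str, granted: bool) -> ConsentRecord:
        prev = self.records[-1].digest() if self.records else "genesis"
        rec = ConsentRecord(user_id, provider_id, purpose, granted,
                            time.time(), prev)
        self.records.append(rec)
        return rec

    def has_consent(self, user_id: str, provider_id: str, purpose: str) -> bool:
        # The most recent matching record decides: consent is dynamic, not one-time.
        for rec in reversed(self.records):
            if (rec.user_id, rec.provider_id, rec.purpose) == (user_id, provider_id, purpose):
                return rec.granted
        return False

    def verify(self) -> bool:
        # Recompute the chain to detect tampering with any earlier record.
        prev = "genesis"
        for rec in self.records:
            if rec.prev_hash != prev:
                return False
            prev = rec.digest()
        return True

# Example: a user grants, then later revokes, consent for one purpose.
ledger = ConsentLedger()
ledger.append("alice", "fit-app", "heart-rate analytics", granted=True)
ledger.append("alice", "fit-app", "heart-rate analytics", granted=False)
assert ledger.has_consent("alice", "fit-app", "heart-rate analytics") is False
assert ledger.verify()
```

The point of the sketch is the audit property: because every grant and revocation is recorded and chained, a provider cannot quietly keep operating on a consent the user has withdrawn, which is the gap one-time consent leaves open.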
Related papers
- A Qualitative Analysis Framework for mHealth Privacy Practices [0.0]
This paper introduces a novel framework for the qualitative evaluation of privacy practices in mHealth apps.
Our investigation encompasses an analysis of 152 leading mHealth apps on the Android platform.
Our findings indicate persistent issues with negligence and misuse of sensitive user information.
arXiv Detail & Related papers (2024-05-28T08:57:52Z) - Assessing Mobile Application Privacy: A Quantitative Framework for Privacy Measurement [0.0]
This work aims to contribute to a digital environment that prioritizes privacy, promotes informed decision-making, and endorses privacy-preserving design principles.
The purpose of this framework is to systematically evaluate the level of privacy risk when using particular Android applications.
arXiv Detail & Related papers (2023-10-31T18:12:19Z) - Privacy-Preserving Joint Edge Association and Power Optimization for the
Internet of Vehicles via Federated Multi-Agent Reinforcement Learning [74.53077322713548]
We investigate the privacy-preserving joint edge association and power allocation problem.
The proposed solution strikes a compelling trade-off, while preserving a higher privacy level than the state-of-the-art solutions.
arXiv Detail & Related papers (2023-01-26T10:09:23Z) - How Do Input Attributes Impact the Privacy Loss in Differential Privacy? [55.492422758737575]
We study the connection between the per-subject norm in DP neural networks and individual privacy loss.
We introduce a novel metric termed the Privacy Loss-Input Susceptibility (PLIS) which allows one to apportion the subject's privacy loss to their input attributes.
arXiv Detail & Related papers (2022-11-18T11:39:03Z) - Privacy Explanations - A Means to End-User Trust [64.7066037969487]
We looked into how explainability might help to tackle this problem.
We created privacy explanations that aim to clarify to end users why and for what purposes specific data is required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z) - SPAct: Self-supervised Privacy Preservation for Action Recognition [73.79886509500409]
Existing approaches for mitigating privacy leakage in action recognition require privacy labels along with the action labels from the video dataset.
Recent developments of self-supervised learning (SSL) have unleashed the untapped potential of the unlabeled data.
We present a novel training framework which removes privacy information from input video in a self-supervised manner without requiring privacy labels.
arXiv Detail & Related papers (2022-03-29T02:56:40Z) - A Blockchain-Based Consent Mechanism for Access to Fitness Data in the
Healthcare Context [0.966840768820136]
This study introduces an architecture for a human-centric, legally compliant, decentralized and dynamic consent system based on blockchain and smart contracts.
The security properties of the proposed system were evaluated using the formal security modeling framework SeMF.
arXiv Detail & Related papers (2022-02-25T09:51:02Z) - On the Privacy of Mental Health Apps: An Empirical Investigation and its
Implications for Apps Development [14.113922276394588]
This paper reports an empirical study aimed at systematically identifying and understanding data privacy incorporated in mental health apps.
We analyzed 27 top-ranked mental health apps from Google Play Store.
The findings reveal important data privacy issues such as unnecessary permissions, insecure cryptography implementations, and leaks of personal data and credentials in logs and web requests.
arXiv Detail & Related papers (2022-01-22T09:23:56Z) - A Review-based Taxonomy for Secure Health Care Monitoring: Wireless
Smart Cameras [9.4545147165828]
This research focuses on the secure storage of patient and medical records in the healthcare sector.
A potential solution comes from biometrics, although their use may be time-consuming and can slow down data retrieval.
This research aims to overcome these challenges and enhance data access control in the healthcare sector through the addition of biometrics in the form of fingerprints.
arXiv Detail & Related papers (2021-07-05T11:59:10Z) - Robustness Threats of Differential Privacy [70.818129585404]
We experimentally demonstrate that networks trained with differential privacy can, in some settings, be even more vulnerable than their non-private counterparts.
We study how the main ingredients of differentially private neural network training, such as gradient clipping and noise addition, affect the robustness of the model (a minimal sketch of these two ingredients appears after this list).
arXiv Detail & Related papers (2020-12-14T18:59:24Z) - BeeTrace: A Unified Platform for Secure Contact Tracing that Breaks Data
Silos [73.84437456144994]
Contact tracing is an important method to control the spread of an infectious disease such as COVID-19.
Current solutions do not utilize the huge volume of data stored in business databases and individual digital devices.
We propose BeeTrace, a unified platform that breaks data silos and deploys state-of-the-art cryptographic protocols to guarantee privacy goals.
arXiv Detail & Related papers (2020-07-05T10:33:45Z)