What is in Your App? Uncovering Privacy Risks of Female Health Applications
- URL: http://arxiv.org/abs/2310.14490v1
- Date: Mon, 23 Oct 2023 01:46:29 GMT
- Title: What is in Your App? Uncovering Privacy Risks of Female Health Applications
- Authors: Muhammad Hassan, Mahnoor Jameel, Tian Wang, Masooda Bashir
- Abstract summary: FemTech, or Female Technology, is an expanding field dedicated to providing affordable and accessible healthcare solutions for women.
With the leading app exceeding 1 billion downloads, these applications are gaining widespread popularity.
This exploratory study delves into the privacy risks associated with seven popular applications.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: FemTech, or Female Technology, is an expanding field dedicated to providing affordable and accessible healthcare solutions for women, prominently through Female Health Applications that monitor health and reproductive data. With the leading app exceeding 1 billion downloads, these applications are gaining widespread popularity. However, amidst contemporary challenges to women's reproductive rights and privacy, there is a noticeable lack of comprehensive studies on the security and privacy aspects of these applications. This exploratory study delves into the privacy risks associated with seven popular applications. Our initial quantitative static analysis reveals varied and potentially risky permissions and numerous third-party trackers. Additionally, a preliminary examination of privacy policies indicates non-compliance with fundamental data privacy principles. These early findings highlight a critical gap in establishing robust privacy and security safeguards for FemTech apps, especially significant in a climate where women's reproductive rights face escalating threats.
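The static permission analysis the abstract describes can be illustrated with a minimal sketch. This is not the paper's actual tooling: it assumes permissions have already been extracted from an app's AndroidManifest.xml (e.g., with a static-analysis tool such as androguard) and classifies them against a small, hand-picked subset of Android's documented "dangerous" protection-level permissions.

```python
# Illustrative sketch, not the study's methodology: classify an app's
# requested permissions against Android's "dangerous" protection level.

# Small subset of Android's dangerous-level permissions relevant to health
# apps; the full list is in the Android developer documentation.
DANGEROUS = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.READ_CONTACTS",
    "android.permission.CAMERA",
    "android.permission.RECORD_AUDIO",
    "android.permission.BODY_SENSORS",
}

def audit_permissions(requested):
    """Split requested permissions into dangerous-level vs. the rest."""
    risky = sorted(p for p in requested if p in DANGEROUS)
    other = sorted(p for p in requested if p not in DANGEROUS)
    return {"risky": risky, "other": other}

# Example input: permissions a hypothetical tracking app might request.
report = audit_permissions([
    "android.permission.INTERNET",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.BODY_SENSORS",
])
print(report["risky"])
```

A real audit would also weigh context (a cycle tracker has little reason to request fine location), which is the kind of judgment the study's manual review supplies.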
Related papers
- Evaluating Privacy Measures in Healthcare Apps Predominantly Used by Older Adults [2.7039386580759666]
The rapid growth of these apps has also heightened concerns about the privacy of older adults' health information.
We evaluated 28 healthcare apps across multiple dimensions, including regulatory compliance, data handling practices, and privacy-focused usability.
Our analysis revealed significant gaps in compliance with privacy standards: only 25% of apps explicitly state compliance with HIPAA, and only 18% mention GDPR.
Surprisingly, 79% of these applications lack breach protocols, putting older adults at risk in the event of a data breach.
arXiv Detail & Related papers (2024-10-18T17:01:14Z) - Privacy Risks of General-Purpose AI Systems: A Foundation for Investigating Practitioner Perspectives [47.17703009473386]
Powerful AI models have led to impressive leaps in performance across a wide range of tasks.
Privacy concerns have led to a wealth of literature covering various privacy risks and vulnerabilities of AI models.
We conduct a systematic review of this literature to provide a concise and usable overview of privacy risks in general-purpose AI systems (GPAIS).
arXiv Detail & Related papers (2024-07-02T07:49:48Z) - Privacy and Security of Women's Reproductive Health Apps in a Changing Legal Landscape [1.7930036479971307]
Privacy and security vulnerabilities in period-tracking and fertility-monitoring apps present significant risks.
Our approach involves manual observations of privacy policies and app permissions, along with dynamic and static analysis.
Our analysis identifies that 61% of the code vulnerabilities found in the apps fall under the Open Web Application Security Project (OWASP) Top Ten.
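The 61% figure suggests a simple tallying step: bucket each static-analysis finding by OWASP category and compute the share covered by the Top Ten. The sketch below is illustrative only (not the paper's tooling); the category labels A01-A10 are the real OWASP Top Ten 2021 identifiers, while the example findings are made up.

```python
# Illustrative sketch: compute the fraction of vulnerability findings that
# fall under OWASP Top Ten (2021) categories A01..A10.
from collections import Counter

OWASP_TOP_TEN = {f"A{i:02d}" for i in range(1, 11)}  # {"A01", ..., "A10"}

def owasp_share(findings):
    """Count findings per category; return counts and the Top Ten fraction."""
    counts = Counter(findings)
    in_top_ten = sum(n for cat, n in counts.items() if cat in OWASP_TOP_TEN)
    return counts, in_top_ten / max(len(findings), 1)

# Hypothetical findings: two Top Ten hits out of three total.
counts, share = owasp_share(["A02", "A05", "OTHER"])
print(round(share, 2))  # 0.67
```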
arXiv Detail & Related papers (2024-04-08T21:19:10Z) - Privacy-preserving Optics for Enhancing Protection in Face De-identification [60.110274007388135]
We propose a hardware-level face de-identification method to solve this vulnerability.
We also propose an anonymization framework that generates a new face using the privacy-preserving image, face heatmap, and a reference face image from a public dataset as input.
arXiv Detail & Related papers (2024-03-31T19:28:04Z) - A Unified View of Differentially Private Deep Generative Modeling [60.72161965018005]
Data with privacy concerns comes with stringent regulations that frequently prohibit data access and data sharing.
Overcoming these obstacles is key for technological progress in many real-world application scenarios that involve privacy sensitive data.
Differentially private (DP) data publishing provides a compelling solution, where only a sanitized form of the data is publicly released.
arXiv Detail & Related papers (2023-09-27T14:38:16Z) - A Survey on Privacy in Graph Neural Networks: Attacks, Preservation, and Applications [76.88662943995641]
Graph Neural Networks (GNNs) have gained significant attention owing to their ability to handle graph-structured data.
However, GNNs trained on sensitive graph data can leak private information; to address this, researchers have started to develop privacy-preserving GNNs.
Despite this progress, there is a lack of a comprehensive overview of the attacks and the techniques for preserving privacy in the graph domain.
arXiv Detail & Related papers (2023-08-31T00:31:08Z) - You Are How You Walk: Quantifying Privacy Risks in Step Count Data [17.157398766367265]
We perform the first systematic study on quantifying privacy risks stemming from step count data.
We propose two attacks: attribute inference (of gender, age, and education) and temporal linkability.
We believe our results can serve as a stepping stone toward a privacy-preserving ecosystem for wearable devices in the future.
arXiv Detail & Related papers (2023-08-09T13:06:13Z) - Privacy Explanations - A Means to End-User Trust [64.7066037969487]
We investigated how explainability might help tackle end users' lack of trust.
We created privacy explanations that aim to clarify to end users why and for what purposes specific data are required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z) - On the Privacy of Mental Health Apps: An Empirical Investigation and its Implications for Apps Development [14.113922276394588]
This paper reports an empirical study aimed at systematically identifying and understanding the data privacy practices incorporated in mental health apps.
We analyzed 27 top-ranked mental health apps from Google Play Store.
The findings reveal important data privacy issues such as unnecessary permissions, insecure cryptography implementations, and leaks of personal data and credentials in logs and web requests.
arXiv Detail & Related papers (2022-01-22T09:23:56Z) - Analysis of Longitudinal Changes in Privacy Behavior of Android Applications [79.71330613821037]
In this paper, we examine the trends in how Android apps have changed over time with respect to privacy.
We examine the adoption of HTTPS, whether apps scan the device for other installed apps, the use of permissions for privacy-sensitive data, and the use of unique identifiers.
We find that privacy-related behavior has improved over time as apps continue to receive updates, and that the third-party libraries used by apps are responsible for more of the privacy issues.
arXiv Detail & Related papers (2021-12-28T16:21:31Z) - User Perception of Privacy with Ubiquitous Devices [5.33024001730262]
This study aims to explore and discover various concerns related to perception of privacy in this era of ubiquitous technologies.
Key themes emerged, including attitudes toward privacy in public and private spaces, privacy awareness, consent seeking, dilemmas and confusion around various technologies, and the impact of attitudes and beliefs on individuals' actions to protect themselves from invasion of privacy in both public and private spaces.
arXiv Detail & Related papers (2021-07-23T05:01:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.