How Far are App Secrets from Being Stolen? A Case Study on Android
- URL: http://arxiv.org/abs/2501.07805v1
- Date: Tue, 14 Jan 2025 03:15:31 GMT
- Title: How Far are App Secrets from Being Stolen? A Case Study on Android
- Authors: Lili Wei, Heqing Huang, Shing-Chi Cheung, Kevin Li
- Abstract summary: Android apps can hold secret strings of their own, such as cloud service credentials or encryption keys.
Leakage of such secret strings can induce unprecedented consequences like monetary losses or leakage of user private information.
This study characterizes app secret leakage issues based on 575 potential app secrets sampled from 14,665 popular Android apps on Google Play.
- Score: 9.880229355258875
- License:
- Abstract: Android apps can hold secret strings of their own, such as cloud service credentials or encryption keys. Leakage of such secret strings can induce unprecedented consequences like monetary losses or leakage of user private information. In practice, various security issues were reported because many apps failed to protect their secrets. However, little is known about the types, usages, exploitability, and consequences of app secret leakage issues. While a large body of literature has been devoted to studying user private information leakage, there is no systematic study characterizing app secret leakage issues. How far are Android app secrets from being stolen? To bridge this gap, we conducted the first systematic study to characterize app secret leakage issues in Android apps based on 575 potential app secrets sampled from 14,665 popular Android apps on Google Play. We summarized the common categories of leaked app secrets, assessed their security impacts, and disclosed the bad practices apps follow in storing their secrets. We devised a text mining strategy using regular expressions and demonstrated that numerous app secrets can be easily stolen, even from highly popular Android apps on Google Play. In a follow-up study, we harvested 3,711 distinct exploitable app secrets through automatic analysis. Our findings highlight the prevalence of this problem and call for greater attention to app secret protection.
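The abstract describes a regular-expression-based text mining strategy for spotting hardcoded secrets. The following is a minimal sketch of that general idea, not the authors' actual tooling: it walks a directory of text files extracted from a decompiled APK (an assumed apktool- or jadx-style layout) and matches a few widely documented credential formats; the pattern set and paths are illustrative assumptions.

```python
# Minimal sketch: scan text files extracted from a decompiled APK for strings
# matching well-known credential formats. Patterns and paths are illustrative
# assumptions, not the rule set used in the paper.
import os
import re
import sys

# A few publicly documented key formats; real scanners use many more patterns.
PATTERNS = {
    "google_api_key": re.compile(r"AIza[0-9A-Za-z\-_]{35}"),
    "aws_access_key_id": re.compile(r"AKIA[0-9A-Z]{16}"),
    "pem_private_key": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan_tree(root: str):
    """Yield (path, label, match) for every candidate secret found under root."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", encoding="utf-8", errors="ignore") as f:
                    text = f.read()
            except OSError:
                continue  # unreadable file; skip it
            for label, pattern in PATTERNS.items():
                for match in pattern.findall(text):
                    yield path, label, match

if __name__ == "__main__":
    # Usage: python scan_secrets.py <decoded-apk-dir>
    for path, label, match in scan_tree(sys.argv[1]):
        print(f"{label}: {match!r} in {path}")
```

A scanner like this only surfaces candidates; the study additionally assessed whether harvested secrets were actually exploitable, which a regex match alone cannot decide.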
Related papers
- Automatically Detecting Checked-In Secrets in Android Apps: How Far Are We? [4.619114660081147]
Developers often overlook the proper storage of such secrets, opting to put them directly into their projects.
These checked-in secrets can be easily extracted and exploited by malicious adversaries.
Unlike open-source projects, Android apps offer no direct access to source code and are often obfuscated, which complicates checked-in secret detection (a minimal sketch of one such detection heuristic appears after this list).
arXiv Detail & Related papers (2024-12-14T18:14:25Z) - PrivAgent: Agentic-based Red-teaming for LLM Privacy Leakage [78.33839735526769]
LLMs may be fooled into outputting private information under carefully crafted adversarial prompts.
PrivAgent is a novel black-box red-teaming framework for privacy leakage.
arXiv Detail & Related papers (2024-12-07T20:09:01Z) - Towards Precise Detection of Personal Information Leaks in Mobile Health Apps [1.5293427903448022]
Mobile apps ask the user for, and then collect and leak, a wealth of Personal Information (PI).
We analyze the PI that apps collect via their user interface, whether the app or third-party code is processing this information, and finally where the data is sent or stored.
We conducted a study on 1,243 Android apps: 623 medical apps and 621 health & fitness apps.
arXiv Detail & Related papers (2024-09-30T23:15:05Z) - The Medium is the Message: How Secure Messaging Apps Leak Sensitive Data to Push Notification Services [9.547428690220618]
This study investigated secure messaging apps' usage of Google's Firebase Cloud Messaging (FCM) service to send push notifications to Android devices.
We analyzed 21 popular secure messaging apps from the Google Play Store to determine what personal information these apps leak in the payload of push notifications sent via FCM.
None of the data we observed being leaked to FCM was specifically disclosed in those apps' privacy disclosures.
arXiv Detail & Related papers (2024-07-15T10:13:30Z) - A Risk Estimation Study of Native Code Vulnerabilities in Android Applications [1.6078134198754157]
We propose a fast risk-based approach that provides a risk score related to the native part of an Android application.
We show that many applications contain well-known vulnerabilities that miscreants can potentially exploit.
arXiv Detail & Related papers (2024-06-04T06:44:07Z) - Can LLMs Keep a Secret? Testing Privacy Implications of Language Models via Contextual Integrity Theory [82.7042006247124]
We show that even the most capable AI models reveal private information in contexts that humans would not, 39% and 57% of the time depending on the model.
Our work underscores the immediate need to explore novel inference-time privacy-preserving approaches, based on reasoning and theory of mind.
arXiv Detail & Related papers (2023-10-27T04:15:30Z) - The Privacy Onion Effect: Memorization is Relative [76.46529413546725]
We show an Onion Effect of memorization: removing the "layer" of outlier points that are most vulnerable exposes a new layer of previously-safe points to the same attack.
It suggests that privacy-enhancing technologies such as machine unlearning could actually harm the privacy of other users.
arXiv Detail & Related papers (2022-06-21T15:25:56Z) - AirGuard -- Protecting Android Users From Stalking Attacks By Apple Find My Devices [78.08346367878578]
We reverse engineer Apple's tracking protection in iOS and discuss its features regarding stalking detection.
We design "AirGuard" and release it as an Android app to protect against abuse by Apple tracking devices.
arXiv Detail & Related papers (2022-02-23T22:31:28Z) - Analysis of Longitudinal Changes in Privacy Behavior of Android Applications [79.71330613821037]
In this paper, we examine the trends in how Android apps have changed over time with respect to privacy.
We examine the adoption of HTTPS, whether apps scan the device for other installed apps, the use of permissions for privacy-sensitive data, and the use of unique identifiers.
We find that privacy-related behavior has improved over time as apps continue to receive updates, and that the third-party libraries used by apps are responsible for more of the privacy issues.
arXiv Detail & Related papers (2021-12-28T16:21:31Z) - Mind the GAP: Security & Privacy Risks of Contact Tracing Apps [75.7995398006171]
Google and Apple have jointly provided an API for exposure notification in order to implement decentralized contact tracing apps using Bluetooth Low Energy.
We demonstrate that in real-world scenarios the GAP design is vulnerable to (i) profiling and possibly de-anonymizing persons, and (ii) relay-based wormhole attacks that can generate fake contacts.
arXiv Detail & Related papers (2020-06-10T16:05:05Z)
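Several of the papers above, the checked-in-secrets study in particular, concern secrets committed directly into app packages. As a small illustration of the kind of heuristic such detectors can apply, the sketch below flags long, high-entropy string resources in an apktool-style decoded app; the path, length cutoff, and entropy threshold are illustrative assumptions and are not taken from any of the listed papers.

```python
# Minimal sketch of a checked-in-secret heuristic: flag long, high-entropy
# string resources in a decoded app. Path, length cutoff, and entropy
# threshold are illustrative assumptions, not rules from the papers above.
import math
import sys
import xml.etree.ElementTree as ET
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Return the Shannon entropy of s in bits per character."""
    counts = Counter(s)
    total = len(s)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def suspicious_strings(strings_xml: str, min_len: int = 20, min_entropy: float = 4.0):
    """Yield (name, value) pairs from a strings.xml file that look secret-like."""
    root = ET.parse(strings_xml).getroot()
    for elem in root.iter("string"):
        value = (elem.text or "").strip()
        if len(value) >= min_len and shannon_entropy(value) >= min_entropy:
            yield elem.get("name"), value

if __name__ == "__main__":
    # Usage: python entropy_check.py <decoded-app>/res/values/strings.xml
    for name, value in suspicious_strings(sys.argv[1]):
        print(f"high-entropy resource {name!r}: {value}")
```

Entropy heuristics trade precision for recall: random-looking API tokens score high, while short or structured secrets can slip through, which is why pattern-based matching and manual validation are typically combined with them.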
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.