Can Virtual Reality Protect Users from Keystroke Inference Attacks?
- URL: http://arxiv.org/abs/2310.16191v1
- Date: Tue, 24 Oct 2023 21:19:38 GMT
- Title: Can Virtual Reality Protect Users from Keystroke Inference Attacks?
- Authors: Zhuolin Yang, Zain Sarwar, Iris Hwang, Ronik Bhaskar, Ben Y. Zhao, Haitao Zheng
- Abstract summary: We show that despite assumptions of enhanced privacy, VR is unable to shield its users from side-channel attacks that steal private information.
This vulnerability arises from VR's greatest strength, its immersive and interactive nature.
- Score: 23.587497604556823
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Virtual Reality (VR) has gained popularity by providing immersive and interactive experiences without geographical limitations. It also provides a sense of personal privacy through physical separation. In this paper, we show that despite assumptions of enhanced privacy, VR is unable to shield its users from side-channel attacks that steal private information. Ironically, this vulnerability arises from VR's greatest strength, its immersive and interactive nature. We demonstrate this by designing and implementing a new set of keystroke inference attacks in shared virtual environments, where an attacker (VR user) can recover the content typed by another VR user by observing their avatar. While the avatar displays noisy telemetry of the user's hand motion, an intelligent attacker can use that data to recognize typed keys and reconstruct typed content, without knowing the keyboard layout or gathering labeled data. We evaluate the proposed attacks using IRB-approved user studies across multiple VR scenarios. For 13 out of 15 tested users, our attacks accurately recognize 86%-98% of typed keys, and the recovered content retains up to 98% of the meaning of the original typed content. We also discuss potential defenses.
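The abstract's central claim — that typed keys can be recognized from noisy, unlabeled avatar hand-motion telemetry, with no keyboard layout or labeled data — can be illustrated with a minimal toy sketch. This is not the paper's actual pipeline: the 2D key layout, noise level, and plain k-means clustering below are all illustrative assumptions. The sketch generates noisy "press" positions for four keys, clusters them without any labels, and measures cluster purity to show how much key identity survives the noise.

```python
import math
import random
from collections import Counter

random.seed(0)

# Hypothetical 2D key positions on a virtual keyboard plane (illustrative,
# not the paper's setup); avatar telemetry adds Gaussian noise to each press.
KEYS = {"a": (0.0, 0.0), "s": (1.0, 0.0), "d": (2.0, 0.0), "f": (3.0, 0.0)}

def observe(key, noise=0.15):
    """One noisy hand-position sample at the moment of a key press."""
    x, y = KEYS[key]
    return (random.gauss(x, noise), random.gauss(y, noise))

# The attacker sees only these unlabeled points, never the typed keys.
typed = [random.choice(sorted(KEYS)) for _ in range(200)]
points = [observe(k) for k in typed]

def farthest_first(pts, k):
    """Spread-out initial centroids so each true key region gets one."""
    cents = [pts[0]]
    while len(cents) < k:
        cents.append(max(pts, key=lambda p: min(math.dist(p, c) for c in cents)))
    return cents

def kmeans(pts, k, iters=25):
    """Plain k-means: recover per-key clusters without labels."""
    cents = farthest_first(pts, k)
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for p in pts:
            buckets[min(range(k), key=lambda i: math.dist(p, cents[i]))].append(p)
        cents = [(sum(x for x, _ in b) / len(b), sum(y for _, y in b) / len(b))
                 for b in buckets if b]
    return cents

cents = kmeans(points, k=len(KEYS))
labels = [min(range(len(cents)), key=lambda i: math.dist(p, cents[i]))
          for p in points]

# Purity: fraction of presses whose cluster's majority key is the true key.
per_cluster = {}
for lab, key in zip(labels, typed):
    per_cluster.setdefault(lab, Counter())[key] += 1
purity = sum(c.most_common(1)[0][1] for c in per_cluster.values()) / len(typed)
print(f"clusters: {len(cents)}, purity: {purity:.2f}")
```

In a real attack the attacker would still have to segment presses out of continuous 3D telemetry and map anonymous clusters to characters (e.g., via language statistics); the sketch only demonstrates the core point that key identity is recoverable from noisy positions without labels.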
Related papers
- mmSpyVR: Exploiting mmWave Radar for Penetrating Obstacles to Uncover Privacy Vulnerability of Virtual Reality [20.72439781800557]
This paper reveals a novel vulnerability in VR systems that allows attackers to capture VR privacy through obstacles.
We propose mmSpyVR, a novel attack on VR user's privacy via mmWave radar.
arXiv Detail & Related papers (2024-11-15T03:22:44Z)
- GAZEploit: Remote Keystroke Inference Attack by Gaze Estimation from Avatar Views in VR/MR Devices [8.206832482042682]
We unveil GAZEploit, a novel eye-tracking-based attack designed to exploit this eye-tracking information by leveraging the common use of virtual appearances in VR applications.
Our research, involving 30 participants, achieved over 80% accuracy in keystroke inference.
Our study also identified over 15 top-rated apps in the Apple Store as vulnerable to the GAZEploit attack, emphasizing the urgent need for bolstered security measures for this state-of-the-art VR/MR text entry method.
arXiv Detail & Related papers (2024-09-12T15:11:35Z)
- Remote Keylogging Attacks in Multi-user VR Applications [19.79250382329298]
This study highlights a significant security threat in multi-user VR applications.
We propose a remote attack that utilizes the avatar rendering information collected from an adversary's game clients to extract user-typed secrets.
We conducted a user study to verify the attack's effectiveness, in which our attack successfully inferred 97.62% of the keystrokes.
arXiv Detail & Related papers (2024-05-22T22:10:40Z)
- Inception Attacks: Immersive Hijacking in Virtual Reality Systems [24.280072806797243]
We introduce the immersive hijacking attack, where a remote attacker takes control of a user's interaction with their VR system.
All of the user's interactions with apps, services and other users can be recorded and modified without their knowledge.
We present our implementation of the immersive hijacking attack on Meta Quest headsets and conduct IRB-approved user studies.
arXiv Detail & Related papers (2024-03-08T23:22:16Z)
- An Empirical Study on Oculus Virtual Reality Applications: Security and Privacy Perspectives [46.995904896724994]
This paper develops a security and privacy assessment tool, namely the VR-SP detector for VR apps.
Using the VR-SP detector, we conduct a comprehensive empirical study on 500 popular VR apps.
We find that a number of security vulnerabilities and privacy leaks widely exist in VR apps.
arXiv Detail & Related papers (2024-02-21T13:53:25Z)
- Evaluating Deep Networks for Detecting User Familiarity with VR from Hand Interactions [7.609875877250929]
We use a VR door, as we envision it to be the first point of entry to collaborative virtual spaces such as meeting rooms, offices, or clinics.
While the user may not be familiar with VR, they would be familiar with the task of opening the door.
Using a pilot dataset of 7 users familiar with VR and 7 not familiar with VR, we achieve a highest accuracy of 88.03% when 6 test users (3 familiar, 3 not familiar) are evaluated with classifiers trained on data from the remaining 8 users.
arXiv Detail & Related papers (2024-01-27T19:15:24Z)
- Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
arXiv Detail & Related papers (2023-11-09T01:34:22Z)
- Unique Identification of 50,000+ Virtual Reality Users from Head & Hand Motion Data [58.27542320038834]
We show that a large number of real VR users can be uniquely and reliably identified across multiple sessions using just their head and hand motion.
After training a classification model on 5 minutes of data per person, a user can be uniquely identified amongst the entire pool of 50,000+ with 94.33% accuracy from 100 seconds of motion.
This work is the first to truly demonstrate the extent to which biomechanics may serve as a unique identifier in VR, on par with widely used biometrics such as facial or fingerprint recognition.
arXiv Detail & Related papers (2023-02-17T15:05:18Z)
- Force-Aware Interface via Electromyography for Natural VR/AR Interaction [69.1332992637271]
We design a learning-based neural interface for natural and intuitive force inputs in VR/AR.
We show that our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration.
We envision our findings to push forward research towards more realistic physicality in future VR/AR.
arXiv Detail & Related papers (2022-10-03T20:51:25Z)
- Unmasking Communication Partners: A Low-Cost AI Solution for Digitally Removing Head-Mounted Displays in VR-Based Telepresence [62.997667081978825]
Face-to-face conversation in Virtual Reality (VR) is a challenge when participants wear head-mounted displays (HMDs).
Past research has shown that high-fidelity face reconstruction with personal avatars in VR is possible under laboratory conditions with high-cost hardware.
We propose one of the first low-cost systems for this task which uses only open source, free software and affordable hardware.
arXiv Detail & Related papers (2020-11-06T23:17:12Z)
- Automatic Recommendation of Strategies for Minimizing Discomfort in Virtual Environments [58.720142291102135]
In this work, we first present a detailed review of possible causes of Cybersickness (CS).
Our system can indicate whether the user is about to enter a sickness state in the next moments of the application.
We also propose the CSPQ (Cybersickness Profile Questionnaire), which is used to identify a player's susceptibility to CS.
arXiv Detail & Related papers (2020-06-27T19:28:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and is not responsible for any consequences of its use.