Inception Attacks: Immersive Hijacking in Virtual Reality Systems
- URL: http://arxiv.org/abs/2403.05721v2
- Date: Mon, 9 Sep 2024 20:03:54 GMT
- Title: Inception Attacks: Immersive Hijacking in Virtual Reality Systems
- Authors: Zhuolin Yang, Cathy Yuanchen Li, Arman Bhalla, Ben Y. Zhao, Haitao Zheng
- Abstract summary: We introduce the immersive hijacking attack, where a remote attacker takes control of a user's interaction with their VR system.
All of the user's interactions with apps, services and other users can be recorded and modified without their knowledge.
We present our implementation of the immersive hijacking attack on Meta Quest headsets and conduct IRB-approved user studies.
- Score: 24.280072806797243
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Today's virtual reality (VR) systems provide immersive interactions that seamlessly connect users with online services and one another. However, these immersive interfaces also introduce new vulnerabilities, making it easier for users to fall prey to new attacks. In this work, we introduce the immersive hijacking attack, where a remote attacker takes control of a user's interaction with their VR system, by trapping them inside a malicious app that masquerades as the full VR interface. Once trapped, all of the user's interactions with apps, services and other users can be recorded and modified without their knowledge. This not only allows traditional privacy attacks but also introduces new interaction attacks, where two VR users encounter vastly different immersive experiences during their interaction. We present our implementation of the immersive hijacking attack on Meta Quest headsets and conduct IRB-approved user studies that validate its efficacy and stealthiness. Finally, we examine effectiveness and tradeoffs of various potential defenses, and propose a multifaceted defense pipeline.
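To make the attack pattern concrete, here is a minimal, purely hypothetical sketch of the interposition layer the abstract implies: an event proxy that records every input and forwards a possibly modified copy to the cloned interface. All names and structures below are ours for illustration, not the paper's implementation.

```python
# Conceptual sketch (not the paper's code): an "inception layer" that
# interposes on user I/O, recording events and forwarding possibly
# rewritten copies to what the user believes is the real VR shell.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class InputEvent:
    timestamp: float
    source: str    # e.g. "controller_left", "gaze", "keyboard" (hypothetical)
    payload: dict  # raw event data

@dataclass
class InceptionProxy:
    forward: Callable[[InputEvent], None]  # delivers events to the cloned UI
    log: List[InputEvent] = field(default_factory=list)
    rewrite: Callable[[InputEvent], InputEvent] = lambda e: e

    def on_event(self, event: InputEvent) -> None:
        self.log.append(event)             # record everything silently
        self.forward(self.rewrite(event))  # pass along a possibly-modified copy

proxy = InceptionProxy(forward=print)      # toy "cloned UI" that just prints
proxy.on_event(InputEvent(0.0, "controller_left", {"button": "trigger"}))
```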
Related papers
- GAZEploit: Remote Keystroke Inference Attack by Gaze Estimation from Avatar Views in VR/MR Devices [8.206832482042682]
We unveil GAZEploit, a novel eye-tracking-based attack specifically designed to exploit eye-tracking information by leveraging the common use of virtual appearances in VR applications.
Our research, involving 30 participants, achieved over 80% accuracy in keystroke inference.
Our study also identified over 15 top-rated apps in the Apple Store as vulnerable to the GAZEploit attack, emphasizing the urgent need for bolstered security measures for this state-of-the-art VR/MR text entry method.
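As a rough illustration of the final inference step such an attack needs (the keyboard layout, coordinates, and function below are invented, not GAZEploit's), keystroke inference can reduce to a nearest-key lookup once a gaze fixation point on the virtual keyboard is estimated:

```python
# Illustrative sketch: map an estimated gaze fixation point to the
# nearest key on a virtual keyboard. All values here are invented.
import math

KEY_CENTERS = {          # hypothetical (x, y) key centers in keyboard space
    "q": (0.0, 0.0), "w": (1.0, 0.0), "e": (2.0, 0.0),
    "a": (0.3, 1.0), "s": (1.3, 1.0), "d": (2.3, 1.0),
}

def infer_key(gaze_x: float, gaze_y: float) -> str:
    """Return the key whose center is closest to the gaze fixation."""
    return min(KEY_CENTERS,
               key=lambda k: math.dist(KEY_CENTERS[k], (gaze_x, gaze_y)))

print(infer_key(1.1, 0.9))  # -> "s"
```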
arXiv Detail & Related papers (2024-09-12T15:11:35Z)
- Remote Keylogging Attacks in Multi-user VR Applications [19.79250382329298]
This study highlights a significant security threat in multi-user VR applications.
We propose a remote attack that utilizes the avatar rendering information collected from an adversary's game clients to extract user-typed secrets.
We conducted a user study to verify the attack's effectiveness, in which our attack successfully inferred 97.62% of the keystrokes.
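One plausible ingredient of such an attack, sketched here with invented thresholds and data rather than the paper's actual method, is to flag sharp dips in a tracked fingertip's height (taken from avatar rendering telemetry) as candidate keypresses:

```python
# Hedged sketch: treat local minima in a fingertip height trace as
# candidate keypresses. Threshold and toy trace are invented.
def detect_presses(fingertip_y, dip_threshold=0.01):
    """Return indices where the fingertip reaches a sharp local minimum."""
    presses = []
    for i in range(1, len(fingertip_y) - 1):
        prev, cur, nxt = fingertip_y[i - 1], fingertip_y[i], fingertip_y[i + 1]
        if cur < prev - dip_threshold and cur < nxt - dip_threshold:
            presses.append(i)
    return presses

# Toy trace with two dips, at indices 2 and 6.
print(detect_presses([0.5, 0.5, 0.4, 0.5, 0.5, 0.5, 0.42, 0.5]))  # [2, 6]
```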
arXiv Detail & Related papers (2024-05-22T22:10:40Z)
- Enabling Developers, Protecting Users: Investigating Harassment and Safety in VR [7.404772554852628]
This study examines users' perceptions of safety control usability and effectiveness as well as the challenges that developers face in designing and deploying VR safety controls.
We identify challenges VR users face while employing safety controls, such as finding users in crowded virtual spaces to block them.
We emphasize the importance of establishing technical and legal guidelines to enhance user safety in virtual environments.
arXiv Detail & Related papers (2024-03-08T18:15:53Z)
- An Empirical Study on Oculus Virtual Reality Applications: Security and Privacy Perspectives [46.995904896724994]
This paper develops a security and privacy assessment tool for VR apps, namely the VR-SP detector.
Using the VR-SP detector, we conduct a comprehensive empirical study on 500 popular VR apps.
We find that security vulnerabilities and privacy leaks are widespread in VR apps.
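The VR-SP detector itself is not reproduced here; as a hedged illustration of one ingredient such a tool plausibly needs, the snippet below flags sensitive sensor permissions declared in a hypothetical app manifest (all permission names are invented):

```python
# Illustrative sketch only: flag declared permissions that touch
# privacy-sensitive VR sensors. Names below are hypothetical.
SENSITIVE = {"EYE_TRACKING", "HAND_TRACKING", "MICROPHONE", "SPATIAL_MAP"}

def flag_permissions(manifest: dict) -> list:
    """Return declared permissions that touch sensitive VR sensors."""
    return sorted(set(manifest.get("permissions", [])) & SENSITIVE)

app = {"name": "demo_app",
       "permissions": ["MICROPHONE", "NETWORK", "EYE_TRACKING"]}
print(flag_permissions(app))  # ['EYE_TRACKING', 'MICROPHONE']
```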
arXiv Detail & Related papers (2024-02-21T13:53:25Z)
- Evaluating Deep Networks for Detecting User Familiarity with VR from Hand Interactions [7.609875877250929]
We use a VR door, as we envision it to be the first point of entry to collaborative virtual spaces, such as meeting rooms, offices, or clinics.
While the user may not be familiar with VR, they would be familiar with the task of opening the door.
Using a pilot dataset of 7 users familiar with VR and 7 unfamiliar with it, we achieve a highest accuracy of 88.03% when 6 test users (3 familiar, 3 unfamiliar) are evaluated with classifiers trained on data from the remaining 8 users.
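That cross-user evaluation protocol (train on 8 users, test on the 6 held out) is easy to mirror in code. The sketch below uses synthetic features in place of the paper's hand-interaction recordings; sample counts and the classifier choice are our assumptions:

```python
# Minimal sketch of a held-out-user evaluation on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_users, feat_dim = 14, 16
X = rng.normal(size=(n_users, 40, feat_dim))         # 40 samples per user
y = np.array([u >= 7 for u in range(n_users)], int)  # 0 = familiar, 1 = not

test_users = [0, 1, 2, 7, 8, 9]                      # 3 familiar, 3 not
train_users = [u for u in range(n_users) if u not in test_users]

Xtr = X[train_users].reshape(-1, feat_dim)
ytr = np.repeat(y[train_users], 40)
Xte = X[test_users].reshape(-1, feat_dim)
yte = np.repeat(y[test_users], 40)

clf = RandomForestClassifier(random_state=0).fit(Xtr, ytr)
print(f"held-out-user accuracy: {clf.score(Xte, yte):.2%}")
```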
arXiv Detail & Related papers (2024-01-27T19:15:24Z)
- Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
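To see why motion telemetry can act like a fingerprint, consider this deliberately simplified sketch (nothing here is the paper's model): each session is reduced to per-channel statistics, and a probe session is matched to the nearest enrolled user:

```python
# Toy identification by nearest-neighbor over per-session statistics.
# Data and the feature choice are invented for illustration.
import numpy as np

def signature(session: np.ndarray) -> np.ndarray:
    """Per-channel mean and std of a (frames, channels) motion trace."""
    return np.concatenate([session.mean(axis=0), session.std(axis=0)])

def identify(session, enrolled: dict) -> str:
    sig = signature(session)
    return min(enrolled, key=lambda u: np.linalg.norm(enrolled[u] - sig))

rng = np.random.default_rng(1)
enrolled = {f"user{i}": signature(rng.normal(i * 0.1, 1.0, (500, 6)))
            for i in range(5)}
probe = rng.normal(0.3, 1.0, (500, 6))  # statistically resembles user3
print(identify(probe, enrolled))
```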
arXiv Detail & Related papers (2023-11-09T01:34:22Z)
- Can Virtual Reality Protect Users from Keystroke Inference Attacks? [23.587497604556823]
We show that despite assumptions of enhanced privacy, VR is unable to shield its users from side-channel attacks that steal private information.
This vulnerability arises from VR's greatest strength, its immersive and interactive nature.
arXiv Detail & Related papers (2023-10-24T21:19:38Z)
- Thinking Two Moves Ahead: Anticipating Other Users Improves Backdoor Attacks in Federated Learning [102.05872020792603]
We propose an attack that anticipates and accounts for the entire federated learning pipeline, including behaviors of other clients.
We show that this new attack is effective in realistic scenarios where the attacker only contributes to a small fraction of randomly sampled rounds.
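A classic building block behind such attacks, shown below as a toy and not as the paper's exact method, is scaling the malicious update so it survives the averaging step the attacker anticipates (dimensions, client count, and target are invented):

```python
# Toy FedAvg round with one boosting attacker among mostly-benign clients.
import numpy as np

def fedavg(updates):
    return np.mean(updates, axis=0)

global_w = np.zeros(4)
backdoor_target = np.array([1.0, -1.0, 0.5, 0.0])
n_clients = 10

benign = [np.random.default_rng(i).normal(0, 0.01, 4) for i in range(9)]
# The attacker anticipates averaging over n_clients and boosts its update:
malicious = n_clients * (backdoor_target - global_w)

new_global = global_w + fedavg(benign + [malicious])
print(np.round(new_global, 2))  # lands close to backdoor_target
```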
arXiv Detail & Related papers (2022-10-17T17:59:38Z)
- Force-Aware Interface via Electromyography for Natural VR/AR Interaction [69.1332992637271]
We design a learning-based neural interface for natural and intuitive force inputs in VR/AR.
We show that our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration.
We envision our findings pushing research toward more realistic physicality in future VR/AR.
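As a hedged sketch of the decoding task (synthetic data, not the paper's learned interface or calibration procedure), per-finger force decoding can be framed as multi-output regression from EMG channels:

```python
# Illustrative sketch: regress per-finger forces from multichannel EMG.
# The mixing matrix and all dimensions are invented.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n, emg_ch, fingers = 2000, 8, 5
mixing = rng.normal(size=(emg_ch, fingers))  # unknown muscle-to-finger map
emg = rng.normal(size=(n, emg_ch))
force = np.clip(emg @ mixing, 0, None)       # forces are non-negative

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500,
                     random_state=0).fit(emg[:1500], force[:1500])
pred = model.predict(emg[1500:])
print("mean abs error:", float(np.mean(np.abs(pred - force[1500:]))))
```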
arXiv Detail & Related papers (2022-10-03T20:51:25Z)
- Learning Effect of Lay People in Gesture-Based Locomotion in Virtual Reality [81.5101473684021]
Some of the most promising methods are gesture-based and do not require additional handheld hardware.
Recent work focused mostly on user preference and performance of the different locomotion techniques.
This work investigates whether and how quickly users can adapt to a hand-gesture-based locomotion system in VR.
arXiv Detail & Related papers (2022-06-16T10:44:16Z)
- Wireless Edge-Empowered Metaverse: A Learning-Based Incentive Mechanism for Virtual Reality [102.4151387131726]
We propose a learning-based Incentive Mechanism framework for VR services in the Metaverse.
First, we propose the quality of perception as the metric for VR users in the virtual world.
Second, for quick trading of VR services between VR users (i.e., buyers) and VR SPs (i.e., sellers), we design a double Dutch auction mechanism.
Third, for auction communication reduction, we design a deep reinforcement learning-based auctioneer to accelerate this auction process.
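A double Dutch auction is straightforward to simulate. The toy below is our construction with invented values, not the paper's mechanism (whose DRL auctioneer would additionally learn how to move the clocks): a descending buyer clock and an ascending seller clock run until they cross, and committed participants trade at the crossing price.

```python
# Toy double Dutch auction: buyers join as the bid clock descends to
# their valuations; sellers join as the ask clock ascends to their costs.
def double_dutch(buyer_vals, seller_costs,
                 start_bid=100.0, start_ask=0.0, step=1.0):
    bid, ask = start_bid, start_ask
    buyers, sellers = [], []
    while bid > ask:
        bid -= step                                   # descending buyer clock
        buyers += [v for v in buyer_vals if v >= bid and v not in buyers]
        ask += step                                   # ascending seller clock
        sellers += [c for c in seller_costs if c <= ask and c not in sellers]
    price = (bid + ask) / 2                           # clocks have crossed
    trades = min(len(buyers), len(sellers))
    return price, trades

print(double_dutch([90, 70, 40], [10, 30, 80]))       # -> (50.0, 2)
```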
arXiv Detail & Related papers (2021-11-07T13:02:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.