Unique Identification of 50,000+ Virtual Reality Users from Head & Hand
Motion Data
- URL: http://arxiv.org/abs/2302.08927v1
- Date: Fri, 17 Feb 2023 15:05:18 GMT
- Authors: Vivek Nair, Wenbo Guo, Justus Mattern, Rui Wang, James F. O'Brien,
Louis Rosenberg, Dawn Song
- Score: 58.27542320038834
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the recent explosive growth of interest and investment in virtual
reality (VR) and the so-called "metaverse," public attention has rightly
shifted toward the unique security and privacy threats that these platforms may
pose. While it has long been known that people reveal information about
themselves via their motion, the extent to which this makes an individual
globally identifiable within virtual reality has not yet been widely
understood. In this study, we show that a large number of real VR users
(N=55,541) can be uniquely and reliably identified across multiple sessions
using just their head and hand motion relative to virtual objects. After
training a classification model on 5 minutes of data per person, a user can be
uniquely identified amongst the entire pool of 50,000+ with 94.33% accuracy
from 100 seconds of motion, and with 73.20% accuracy from just 10 seconds of
motion. This work is the first to truly demonstrate the extent to which
biomechanics may serve as a unique identifier in VR, on par with widely used
biometrics such as facial or fingerprint recognition.
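The listing reports the identification accuracy but none of the implementation details. Purely as an illustration of the general idea behind motion-based identification — featurize head and hand telemetry, enroll one template per user, match unseen sessions against the enrolled templates — the toy sketch below uses synthetic data and a nearest-centroid matcher. The feature set, window size, channel count, and matcher are all assumptions for demonstration, not the paper's actual model, which trains a classifier on real telemetry from 55,541 users.

```python
import numpy as np

rng = np.random.default_rng(0)

def featurize(telemetry):
    """Summarize a motion window (frames x channels) into a fixed-length
    vector: per-channel mean, standard deviation, and mean absolute
    frame-to-frame change (a crude velocity statistic)."""
    vel = np.abs(np.diff(telemetry, axis=0)).mean(axis=0)
    return np.concatenate([telemetry.mean(axis=0),
                           telemetry.std(axis=0),
                           vel])

# Synthetic stand-in data (NOT the paper's dataset): each "user" has a
# characteristic motion offset plus per-session noise.
n_users, frames, channels = 50, 120, 21  # e.g. 3 tracked points x 7 values
profiles = rng.normal(size=(n_users, channels))

def session(user):
    """Simulate one session of telemetry for a given user."""
    return profiles[user] + 0.3 * rng.normal(size=(frames, channels))

# Enrollment: one feature centroid per user from a training session.
centroids = np.stack([featurize(session(u)) for u in range(n_users)])

def identify(telemetry):
    """Nearest-centroid identification of an unseen session."""
    f = featurize(telemetry)
    return int(np.argmin(np.linalg.norm(centroids - f, axis=1)))

correct = sum(identify(session(u)) == u for u in range(n_users))
print(f"identified {correct}/{n_users} synthetic users")
```

On this synthetic data the mean-position features alone separate users easily; the paper's point is that comparable separability holds for real users at a scale of 50,000+, which is what makes the telemetry a de facto biometric.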
Related papers
- Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
arXiv Detail & Related papers (2023-11-09T01:34:22Z)
- Can Virtual Reality Protect Users from Keystroke Inference Attacks? [23.587497604556823]
We show that despite assumptions of enhanced privacy, VR is unable to shield its users from side-channel attacks that steal private information.
This vulnerability arises from VR's greatest strength, its immersive and interactive nature.
arXiv Detail & Related papers (2023-10-24T21:19:38Z)
- BehaVR: User Identification Based on VR Sensor Data [7.114684260471529]
We introduce BehaVR, a framework for collecting and analyzing data from all sensor groups collected by multiple apps running on a VR device.
We use BehaVR to collect data from real users that interact with 20 popular real-world apps.
We build machine learning models for user identification within and across apps, with features extracted from available sensor data.
arXiv Detail & Related papers (2023-08-14T17:43:42Z)
- Physics-based Motion Retargeting from Sparse Inputs [73.94570049637717]
Commercial AR/VR products consist only of a headset and controllers, providing very limited sensor data of the user's pose.
We introduce a method to retarget motions in real-time from sparse human sensor data to characters of various morphologies.
We show that the avatar poses often match the user surprisingly well, despite having no sensor information of the lower body available.
arXiv Detail & Related papers (2023-07-04T21:57:05Z)
- Eye-tracked Virtual Reality: A Comprehensive Survey on Methods and Privacy Challenges [33.50215933003216]
This survey focuses on eye tracking in virtual reality (VR) and its privacy implications.
We first cover major works in eye tracking, VR, and privacy areas between the years 2012 and 2022.
We focus on eye-based authentication as well as computational methods to preserve the privacy of individuals and their eye-tracking data in VR.
arXiv Detail & Related papers (2023-05-23T14:02:38Z)
- Towards Zero-trust Security for the Metaverse [14.115124942695887]
We develop a holistic research agenda for zero-trust user authentication in social virtual reality (VR).
Our proposed research includes four concrete steps: investigating biometrics-based authentication that is suitable for continuously authenticating VR users, leveraging federated learning for protecting user privacy in biometric data, improving the accuracy of continuous VR authentication with multimodal data, and boosting the usability of zero-trust security with adaptive VR authentication.
arXiv Detail & Related papers (2023-02-17T14:13:02Z)
- Force-Aware Interface via Electromyography for Natural VR/AR Interaction [69.1332992637271]
We design a learning-based neural interface for natural and intuitive force inputs in VR/AR.
We show that our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration.
We envision our findings to push forward research towards more realistic physicality in future VR/AR.
arXiv Detail & Related papers (2022-10-03T20:51:25Z)
- The Gesture Authoring Space: Authoring Customised Hand Gestures for Grasping Virtual Objects in Immersive Virtual Environments [81.5101473684021]
This work proposes a hand gesture authoring tool for object specific grab gestures allowing virtual objects to be grabbed as in the real world.
The presented solution uses template matching for gesture recognition and requires no technical knowledge to design and create custom tailored hand gestures.
The study showed that gestures created with the proposed approach are perceived by users as a more natural input modality than the others.
arXiv Detail & Related papers (2022-07-03T18:33:33Z)
- Robust Egocentric Photo-realistic Facial Expression Transfer for Virtual Reality [68.18446501943585]
Social presence will fuel the next generation of communication systems driven by digital humans in virtual reality (VR).
The best 3D video-realistic VR avatars that minimize the uncanny effect rely on person-specific (PS) models.
This paper makes progress in overcoming these limitations by proposing an end-to-end multi-identity architecture.
arXiv Detail & Related papers (2021-04-10T15:48:53Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.