BehaVR: User Identification Based on VR Sensor Data
- URL: http://arxiv.org/abs/2308.07304v2
- Date: Mon, 23 Sep 2024 05:13:55 GMT
- Title: BehaVR: User Identification Based on VR Sensor Data
- Authors: Ismat Jarin, Yu Duan, Rahmadi Trimananda, Hao Cui, Salma Elmalaki, Athina Markopoulou
- Abstract summary: We introduce BehaVR, a framework for collecting and analyzing data from all sensor groups collected by multiple apps running on a VR device.
We use BehaVR to collect data from real users who interact with 20 popular real-world apps.
We build machine learning models for user identification within and across apps, with features extracted from available sensor data.
- Score: 7.114684260471529
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Virtual reality (VR) platforms enable a wide range of applications; however, they also pose unique privacy risks. In particular, VR devices are equipped with a rich set of sensors that collect personal and sensitive information (e.g., body motion, eye gaze, hand joints, and facial expression). The data from these newly available sensors can be used to uniquely identify a user, even in the absence of explicit identifiers. In this paper, we seek to understand the extent to which a user can be identified based solely on VR sensor data, within and across real-world apps from diverse genres. We consider adversaries with capabilities that range from observing APIs available within a single app (app adversary) to observing all or selected sensor measurements across multiple apps on the VR device (device adversary). To that end, we introduce BehaVR, a framework for collecting and analyzing data from all sensor groups collected by multiple apps running on a VR device. We use BehaVR to collect data from real users who interact with 20 popular real-world apps. We use that data to build machine learning models for user identification within and across apps, with features extracted from available sensor data. We show that these models can identify users with an accuracy of up to 100%, and we reveal the most important features and sensor groups, depending on the functionality of the app and the adversary. To the best of our knowledge, BehaVR is the first to comprehensively analyze user identification in VR, i.e., considering all sensor measurements available on consumer VR devices, collected by multiple real-world apps as opposed to custom-made ones.
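For intuition, the following is a minimal sketch (not BehaVR's actual pipeline) of the kind of user-identification model the abstract describes: sensor streams are split into fixed-size windows, per-channel summary statistics serve as features, and a standard classifier is trained with user identities as labels. The window length, channel count, feature set, classifier choice, and synthetic data are all illustrative assumptions standing in for real body-motion, eye-gaze, hand-joint, and facial-expression measurements.

```python
# Hedged sketch of user identification from windowed VR sensor features.
# All sizes and the Random Forest choice are illustrative, not BehaVR's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def window_features(stream, window=90):
    """Split a (time x channels) stream into fixed windows and use
    per-channel mean/std/min/max as a simple feature vector."""
    n = (len(stream) // window) * window
    wins = stream[:n].reshape(-1, window, stream.shape[1])
    return np.concatenate([wins.mean(1), wins.std(1), wins.min(1), wins.max(1)], axis=1)

# Toy stand-in for per-user VR sensor streams (e.g., head/hand channels).
rng = np.random.default_rng(0)
X, y = [], []
for user in range(20):
    offset = rng.normal(0, 1, size=9)                 # proxy for per-user biomechanics
    stream = rng.normal(offset, 0.3, size=(1800, 9))  # ~1800 samples x 9 channels
    feats = window_features(stream)
    X.append(feats)
    y.extend([user] * len(feats))
X, y = np.vstack(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("user identification accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```

In BehaVR's setting, such a model would be trained per app (app adversary) or across apps (device adversary), and feature importances would indicate which sensor groups drive identification for a given app.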
Related papers
- An Empirical Study on Oculus Virtual Reality Applications: Security and Privacy Perspectives [46.995904896724994]
This paper develops a security and privacy assessment tool, namely the VR-SP detector for VR apps.
Using the VR-SP detector, we conduct a comprehensive empirical study on 500 popular VR apps.
We find that a number of security vulnerabilities and privacy leaks widely exist in VR apps.
arXiv Detail & Related papers (2024-02-21T13:53:25Z)
- Evaluating Deep Networks for Detecting User Familiarity with VR from Hand Interactions [7.609875877250929]
We use a VR door, as we envision it to be the first point of entry to collaborative virtual spaces such as meeting rooms, offices, or clinics.
While the user may not be familiar with VR, they would be familiar with the task of opening the door.
Using a pilot dataset consisting of 7 users familiar with VR and 7 users not familiar with VR, we achieve the highest accuracy of 88.03% when 6 test users (3 familiar and 3 not familiar) are evaluated with classifiers trained using data from the remaining 8 users.
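As a rough illustration of the leave-users-out protocol described above (no test user appears in the training data), here is a hedged sketch on synthetic data; the feature dimensions, logistic-regression classifier, and synthetic features are assumptions, with only the 7+7 user pool and the 6-test/8-train split taken from the summary.

```python
# Illustrative leave-users-out split for a binary "familiar with VR" classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(2)
USERS, WINDOWS, FEATS = 14, 40, 8          # 7 familiar + 7 unfamiliar users (illustrative sizes)

# One feature matrix per user; label 1 = familiar with VR, 0 = not familiar.
user_labels = np.array([1] * 7 + [0] * 7)
user_data = [rng.normal(label * 0.5, 1.0, size=(WINDOWS, FEATS)) for label in user_labels]

test_users = [0, 1, 2, 7, 8, 9]            # 3 familiar + 3 unfamiliar held out entirely
train_users = [u for u in range(USERS) if u not in test_users]

def stack(users):
    X = np.vstack([user_data[u] for u in users])
    y = np.repeat(user_labels[users], WINDOWS)
    return X, y

X_tr, y_tr = stack(train_users)
X_te, y_te = stack(test_users)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out-user accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```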
arXiv Detail & Related papers (2024-01-27T19:15:24Z)
- Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
arXiv Detail & Related papers (2023-11-09T01:34:22Z)
- Multisensory extended reality applications offer benefits for volumetric biomedical image analysis in research and medicine [2.46537907738351]
3D data from high-resolution volumetric imaging is a central resource for diagnosis and treatment in modern medicine.
Recent research used extended reality (XR) for perceiving 3D images with visual depth perception and touch but used restrictive haptic devices.
In this study, 24 experts in biomedical imaging from research and medicine explored 3D medical shapes with three applications.
arXiv Detail & Related papers (2023-11-07T13:37:47Z)
- Towards Modeling Software Quality of Virtual Reality Applications from Users' Perspectives [44.46088489942242]
We conduct the first large-scale empirical study to model the software quality of VR applications from users' perspectives.
We analyze 1,132,056 user reviews of 14,150 VR applications across seven app stores through a semiautomatic review mining approach.
Our analysis reveals that VR-specific quality attributes, which are closely related to the most distinctive properties of VR applications, are of utmost importance to users.
arXiv Detail & Related papers (2023-08-13T14:42:47Z)
- VR.net: A Real-world Dataset for Virtual Reality Motion Sickness Research [33.092692299254814]
We introduce VR.net, a dataset offering approximately 12 hours of gameplay videos from ten real-world games spanning ten diverse genres.
For each video frame, a rich set of motion sickness-related labels, such as camera/object movement, depth field, and motion flow, are accurately assigned.
We utilize a tool to automatically and precisely extract ground truth data from 3D engines' rendering pipelines without accessing VR games' source code.
arXiv Detail & Related papers (2023-06-06T03:43:11Z)
- Unique Identification of 50,000+ Virtual Reality Users from Head & Hand Motion Data [58.27542320038834]
We show that a large number of real VR users can be uniquely and reliably identified across multiple sessions using just their head and hand motion.
After training a classification model on 5 minutes of data per person, a user can be uniquely identified among the entire pool of 50,000+ with 94.33% accuracy from 100 seconds of motion (a sketch of this enrollment-then-identification setup follows this entry).
This work is the first to truly demonstrate the extent to which biomechanics may serve as a unique identifier in VR, on par with widely used biometrics such as facial or fingerprint recognition.
arXiv Detail & Related papers (2023-02-17T15:05:18Z)
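As referenced in the entry above, here is a minimal sketch of the enrollment-then-identification idea: a classifier is fit on short motion windows per user, and a longer observation is attributed to a user by aggregating its per-window predictions with a majority vote. The nearest-neighbour classifier, window counts, channel count, and synthetic features are illustrative assumptions, not the paper's model.

```python
# Hedged sketch: enroll users on short motion windows, then identify from a
# longer observation by majority vote over its per-window predictions.
import numpy as np
from collections import Counter
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
WINDOW_COUNT_ENROLL, CHANNELS, USERS = 60, 6, 50   # illustrative sizes only

def synth_windows(user_offset, n_windows):
    """Stand-in for head/hand motion features: one feature vector per window."""
    return rng.normal(user_offset, 0.4, size=(n_windows, CHANNELS))

offsets = rng.normal(0, 1, size=(USERS, CHANNELS))
X_enroll = np.vstack([synth_windows(o, WINDOW_COUNT_ENROLL) for o in offsets])  # "5 minutes" per user
y_enroll = np.repeat(np.arange(USERS), WINDOW_COUNT_ENROLL)

clf = KNeighborsClassifier(n_neighbors=5).fit(X_enroll, y_enroll)

def identify(observation):
    """Attribute a multi-window observation to the most frequently predicted user."""
    votes = clf.predict(observation)
    return Counter(votes).most_common(1)[0][0]

test_user = 7
obs = synth_windows(offsets[test_user], 20)        # stand-in for "100 seconds" of motion
print("predicted user:", identify(obs), "true user:", test_user)
```

Per-window accuracy may be modest on its own; aggregating votes over a longer observation is what makes the identification reliable, which mirrors the 100-seconds-of-motion setup the entry describes.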
- Learning Effect of Lay People in Gesture-Based Locomotion in Virtual Reality [81.5101473684021]
Some of the most promising methods are gesture-based and do not require additional handheld hardware.
Recent work focused mostly on user preference and performance of the different locomotion techniques.
This work investigates whether and how quickly users can adapt to a hand-gesture-based locomotion system in VR.
arXiv Detail & Related papers (2022-06-16T10:44:16Z)
- SensiX: A Platform for Collaborative Machine Learning on the Edge [69.1412199244903]
We present SensiX, a personal edge platform that stays between sensor data and sensing models.
We demonstrate its efficacy in developing motion and audio-based multi-device sensing systems.
Our evaluation shows that SensiX offers a 7-13% increase in overall accuracy and up to a 30% increase across different environment dynamics, at the expense of a 3 mW power overhead.
arXiv Detail & Related papers (2020-12-04T23:06:56Z)