VR.net: A Real-world Dataset for Virtual Reality Motion Sickness Research
- URL: http://arxiv.org/abs/2306.03381v1
- Date: Tue, 6 Jun 2023 03:43:11 GMT
- Title: VR.net: A Real-world Dataset for Virtual Reality Motion Sickness Research
- Authors: Elliott Wen, Chitralekha Gupta, Prasanth Sasikumar, Mark Billinghurst,
James Wilmott, Emily Skow, Arindam Dey, Suranga Nanayakkara
- Abstract summary: We introduce VR.net, a dataset offering approximately 12 hours of gameplay video from ten real-world games spanning 10 diverse genres.
For each video frame, a rich set of motion sickness-related labels, such as camera/object movement, depth field, and motion flow, is accurately assigned.
We utilize a tool to automatically and precisely extract ground truth data from 3D engines' rendering pipelines without accessing VR games' source code.
- Score: 33.092692299254814
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Researchers have used machine learning approaches to identify motion sickness
in VR experiences. These approaches demand an accurately labeled, real-world,
and diverse dataset for high accuracy and generalizability. As a starting point
to address this need, we introduce VR.net, a dataset offering approximately
12 hours of gameplay video from ten real-world games spanning 10 diverse genres. For
each video frame, a rich set of motion sickness-related labels, such as
camera/object movement, depth field, and motion flow, is accurately assigned.
Building such a dataset is challenging since manual labeling would require an
infeasible amount of time. Instead, we utilize a tool to automatically and
precisely extract ground truth data from 3D engines' rendering pipelines
without accessing VR games' source code. We illustrate the utility of VR.net
through several applications, such as risk factor detection and sickness level
prediction. We continuously expand VR.net and envision its next version
offering 10X more data than the current form. We believe that the scale,
accuracy, and diversity of VR.net can offer unparalleled opportunities for VR
motion sickness research and beyond.
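The per-frame motion-flow labels described above come directly from the engine's rendering pipeline; outside an engine, a crude approximation of global frame-to-frame motion can be computed from the video alone. The sketch below is a hypothetical illustration using phase correlation (it is not the authors' extraction tool, and the frame data is synthetic):

```python
import numpy as np

def global_motion(prev_frame, next_frame):
    """Estimate the dominant (dx, dy) translation between two grayscale
    frames via phase correlation -- a rough stand-in for the engine-level
    motion-flow labels VR.net extracts, shown only for illustration."""
    f1 = np.fft.fft2(prev_frame.astype(float))
    f2 = np.fft.fft2(next_frame.astype(float))
    cross = np.conj(f1) * f2
    cross /= np.abs(cross) + 1e-9          # keep phase, discard magnitude
    corr = np.abs(np.fft.ifft2(cross))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map large indices back to negative shifts (FFT wrap-around).
    h, w = prev_frame.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dx), int(dy)

# Toy usage: a bright square shifted 5 px to the right between frames.
a = np.zeros((64, 64)); a[20:40, 20:40] = 1.0
b = np.zeros((64, 64)); b[20:40, 25:45] = 1.0
print(global_motion(a, b))  # (5, 0)
```

Unlike this pixel-level estimate, the engine-side extraction can separate camera movement from per-object movement, which is why the dataset's labels are described as precise.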
Related papers
- Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
arXiv Detail & Related papers (2023-11-09T01:34:22Z)
- Unique Identification of 50,000+ Virtual Reality Users from Head & Hand Motion Data [58.27542320038834]
We show that a large number of real VR users can be uniquely and reliably identified across multiple sessions using just their head and hand motion.
After training a classification model on 5 minutes of data per person, a user can be uniquely identified amongst the entire pool of 50,000+ with 94.33% accuracy from 100 seconds of motion.
This work is the first to truly demonstrate the extent to which biomechanics may serve as a unique identifier in VR, on par with widely used biometrics such as facial or fingerprint recognition.
arXiv Detail & Related papers (2023-02-17T15:05:18Z)
- Towards 3D VR-Sketch to 3D Shape Retrieval [128.47604316459905]
We study the use of 3D sketches as an input modality and advocate a VR-scenario where retrieval is conducted.
As a first stab at this new 3D VR-sketch to 3D shape retrieval problem, we make four contributions.
arXiv Detail & Related papers (2022-09-20T22:04:31Z)
- Deep Billboards towards Lossless Real2Sim in Virtual Reality [20.7032774699291]
We develop Deep Billboards that model 3D objects implicitly using neural networks.
Our system, connecting a commercial VR headset with a server running neural rendering, allows real-time high-resolution simulation of detailed rigid objects.
We augment Deep Billboards with physical interaction capability, adapting classic billboards from screen-based games to immersive VR.
arXiv Detail & Related papers (2022-08-08T16:16:29Z)
- Wireless Edge-Empowered Metaverse: A Learning-Based Incentive Mechanism for Virtual Reality [102.4151387131726]
We propose a learning-based Incentive Mechanism framework for VR services in the Metaverse.
First, we propose the quality of perception as the metric for VR users in the virtual world.
Second, for quick trading of VR services between VR users (i.e., buyers) and VR SPs (i.e., sellers), we design a double Dutch auction mechanism.
Third, for auction communication reduction, we design a deep reinforcement learning-based auctioneer to accelerate this auction process.
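The double Dutch auction named in the entry above can be sketched as a toy simulation: a buy clock descends while a sell clock ascends, participants enter when their clock reaches their valuation or cost, and trading stops when the clocks cross. This is a hypothetical illustration of the general mechanism only, not the paper's exact design:

```python
def double_dutch_auction(buyer_values, seller_costs, tick=1.0,
                         buy_start=100.0, sell_start=0.0):
    """Toy double Dutch auction. Buyers join the trade queue when the
    descending buy clock drops to their valuation; sellers join when
    the ascending sell clock rises to their cost. The auction ends when
    the two clocks cross, and queued participants trade at the midpoint
    of the final clock prices. (Illustrative sketch; the clock dynamics
    and tie-breaking here are assumptions.)"""
    buy_clock, sell_clock = buy_start, sell_start
    buyers = sorted(buyer_values, reverse=True)   # highest valuation first
    sellers = sorted(seller_costs)                # lowest cost first
    in_b, in_s = [], []
    while buy_clock > sell_clock:
        buy_clock -= tick
        while buyers and buyers[0] >= buy_clock:  # admit willing buyers
            in_b.append(buyers.pop(0))
        sell_clock += tick
        while sellers and sellers[0] <= sell_clock:  # admit willing sellers
            in_s.append(sellers.pop(0))
    price = (buy_clock + sell_clock) / 2
    trades = min(len(in_b), len(in_s))
    return trades, price

# Toy usage: buyers valuing the service at 80/60/30, sellers' costs 20/40/70.
print(double_dutch_auction([80, 60, 30], [20, 40, 70]))  # (2, 50.0)
```

The paper's deep reinforcement learning auctioneer would, in effect, learn how aggressively to move such clocks so the auction converges in fewer communication rounds.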
arXiv Detail & Related papers (2021-11-07T13:02:52Z)
- Towards a Better Understanding of VR Sickness: Physical Symptom Prediction for VR Contents [42.71591815197509]
We address the black-box issue of VR sickness assessment (VRSA) by evaluating the level of physical symptoms of VR sickness.
For VR contents that induce a similar overall sickness level, the physical symptoms can vary depending on the characteristics of the content.
In this paper, we predict the degrees of the main physical symptoms affecting the overall degree of VR sickness, namely disorientation, nausea, and oculomotor discomfort.
arXiv Detail & Related papers (2021-04-14T11:09:03Z)
- Robust Egocentric Photo-realistic Facial Expression Transfer for Virtual Reality [68.18446501943585]
Social presence will fuel the next generation of communication systems driven by digital humans in virtual reality (VR).
The best 3D video-realistic VR avatars that minimize the uncanny effect rely on person-specific (PS) models.
This paper makes progress in overcoming these limitations by proposing an end-to-end multi-identity architecture.
arXiv Detail & Related papers (2021-04-10T15:48:53Z)
- Guidelines for the Development of Immersive Virtual Reality Software for Cognitive Neuroscience and Neuropsychology: The Development of Virtual Reality Everyday Assessment Lab (VR-EAL) [0.0]
This study offers guidelines for the development of VR software in cognitive neuroscience and neuropsychology.
Twenty-five participants aged between 20 and 45 years with 12-16 years of full-time education evaluated various versions of VR-EAL.
The final version of VR-EAL achieved high scores in every sub-score of the VRNQ and exceeded its parsimonious cut-offs.
arXiv Detail & Related papers (2021-01-20T14:55:57Z)
- Validation of the Virtual Reality Neuroscience Questionnaire: Maximum Duration of Immersive Virtual Reality Sessions Without the Presence of Pertinent Adverse Symptomatology [0.0]
The VRNQ was developed to assess the quality of VR software in terms of user experience, game mechanics, in-game assistance, and VRISE.
The maximum duration of VR sessions should be between 55 and 70 minutes when the VR software meets or exceeds the parsimonious cut-offs of the VRNQ.
Deeper immersion, better quality of graphics and sound, and more helpful in-game instructions and prompts were found to reduce VRISE intensity.
arXiv Detail & Related papers (2021-01-20T14:10:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.