Automatic Recommendation of Strategies for Minimizing Discomfort in
Virtual Environments
- URL: http://arxiv.org/abs/2006.15432v1
- Date: Sat, 27 Jun 2020 19:28:48 GMT
- Title: Automatic Recommendation of Strategies for Minimizing Discomfort in
Virtual Environments
- Authors: Thiago Porcino, Esteban Clua, Daniela Trevisan, Érick Rodrigues,
Alexandre Silva
- Abstract summary: In this work, we first present a detailed review of possible causes of Cybersickness (CS).
Our system can predict whether the user is about to enter a discomfort situation in the next moments of the application.
The CSPQ (Cybersickness Profile Questionnaire) is also proposed to identify the player's susceptibility to CS.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Virtual reality (VR) is an imminent trend in games, education, entertainment,
military, and health applications, as the use of head-mounted displays is
becoming accessible to the mass market. Virtual reality provides immersive
experiences but still does not offer an entirely perfect situation, mainly due
to Cybersickness (CS) issues. In this work, we first present a detailed review
of possible causes of CS. Next, we propose a novel CS prediction solution: our
system can predict whether the user is about to enter a discomfort situation in
the next moments of the application. We use Random Forest classifiers trained
on a dataset we have produced. We also propose the CSPQ (Cybersickness Profile
Questionnaire), which is used to identify the player's susceptibility to CS and
to guide the dataset construction. In addition, we designed two immersive
environments for empirical studies in which participants are asked to complete
the questionnaire and describe (orally) the degree of discomfort during their
gaming experience. Our data were collected from 84 individuals on different
days, using VR devices. Our proposal also allows us to identify the most
frequent attributes (causes) in the observed discomfort situations.
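The prediction step described above can be illustrated with a minimal sketch. This is not the authors' code or dataset: the feature names, the synthetic labels, and the decision rule below are assumptions chosen only to show how a Random Forest classifier could flag imminent discomfort from session attributes and expose the most influential attributes, mirroring the paper's analysis.

```python
# Illustrative sketch, not the paper's implementation.
# Features and labels are synthetic stand-ins for the authors' dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical session attributes: rotation speed, acceleration, exposure time
X = rng.uniform(0.0, 1.0, size=(n, 3))
# Synthetic rule: discomfort grows with rotation speed and exposure time
y = (0.6 * X[:, 0] + 0.4 * X[:, 2] + rng.normal(0.0, 0.1, n) > 0.6).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Feature importances parallel the paper's goal of identifying which
# attributes (causes) appear most often in discomfort situations.
print("held-out accuracy:", clf.score(X_te, y_te))
print("attribute importances:", clf.feature_importances_)
```

In a real pipeline the per-user CSPQ susceptibility score would simply be an additional input column alongside the session attributes.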
Related papers
- Privacy-Preserving Gaze Data Streaming in Immersive Interactive Virtual Reality: Robustness and User Experience [11.130411904676095]
Eye tracking data, if exposed, can be used for re-identification attacks.
We develop a methodology to evaluate real-time privacy mechanisms for interactive VR applications.
arXiv Detail & Related papers (2024-02-12T14:53:12Z)
- Deep Motion Masking for Secure, Usable, and Scalable Real-Time Anonymization of Virtual Reality Motion Data [49.68609500290361]
Recent studies have demonstrated that the motion tracking "telemetry" data used by nearly all VR applications is as uniquely identifiable as a fingerprint scan.
We present in this paper a state-of-the-art VR identification model that can convincingly bypass known defensive countermeasures.
arXiv Detail & Related papers (2023-11-09T01:34:22Z)
- Toward Optimized VR/AR Ergonomics: Modeling and Predicting User Neck Muscle Contraction [21.654553113159665]
We measure, model, and predict VR users' neck muscle contraction levels (MCL) while they move their heads to interact with the virtual environment.
We develop a bio-physically inspired computational model to predict neck MCL under diverse head kinematic states.
We hope this research will motivate new ergonomic-centered designs for VR/AR and interactive graphics applications.
arXiv Detail & Related papers (2023-08-28T18:58:01Z)
- Force-Aware Interface via Electromyography for Natural VR/AR Interaction [69.1332992637271]
We design a learning-based neural interface for natural and intuitive force inputs in VR/AR.
We show that our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration.
We envision our findings to push forward research towards more realistic physicality in future VR/AR.
arXiv Detail & Related papers (2022-10-03T20:51:25Z)
- Fine-Grained VR Sketching: Dataset and Insights [140.0579567561475]
We present the first fine-grained dataset of 1,497 3D VR sketch and 3D shape pairs of a chair category with large shapes diversity.
Our dataset supports the recent trend in the sketch community on fine-grained data analysis.
arXiv Detail & Related papers (2022-09-20T21:30:54Z)
- Perceptual Quality Assessment of Virtual Reality Videos in the Wild [53.94620993606658]
Existing panoramic video databases only consider synthetic distortions, assume fixed viewing conditions, and are limited in size.
We construct the VR Video Quality in the Wild (VRVQW) database, containing $502$ user-generated videos with diverse content and distortion characteristics.
We conduct a formal psychophysical experiment to record the scanpaths and perceived quality scores from $139$ participants under two different viewing conditions.
arXiv Detail & Related papers (2022-06-13T02:22:57Z)
- Robust Egocentric Photo-realistic Facial Expression Transfer for Virtual Reality [68.18446501943585]
Social presence will fuel the next generation of communication systems driven by digital humans in virtual reality (VR).
The best 3D video-realistic VR avatars that minimize the uncanny effect rely on person-specific (PS) models.
This paper makes progress in overcoming these limitations by proposing an end-to-end multi-identity architecture.
arXiv Detail & Related papers (2021-04-10T15:48:53Z)
- Guidelines for the Development of Immersive Virtual Reality Software for Cognitive Neuroscience and Neuropsychology: The Development of Virtual Reality Everyday Assessment Lab (VR-EAL) [0.0]
This study offers guidelines for the development of VR software in cognitive neuroscience and neuropsychology.
Twenty-five participants aged between 20 and 45 years with 12-16 years of full-time education evaluated various versions of VR-EAL.
The final version of VR-EAL achieved high scores in every sub-score of the VRNQ and exceeded its parsimonious cut-offs.
arXiv Detail & Related papers (2021-01-20T14:55:57Z)
- Validation of the Virtual Reality Neuroscience Questionnaire: Maximum Duration of Immersive Virtual Reality Sessions Without the Presence of Pertinent Adverse Symptomatology [0.0]
The VRNQ was developed to assess the quality of VR software in terms of user experience, game mechanics, in-game assistance, and VRISE.
The maximum duration of VR sessions should be between 55 and 70 minutes when the VR software meets or exceeds the parsimonious cut-offs of the VRNQ.
Deeper immersion, better quality of graphics and sound, and more helpful in-game instructions and prompts were found to reduce VRISE intensity.
arXiv Detail & Related papers (2021-01-20T14:10:44Z)
- Facial Expression Recognition Under Partial Occlusion from Virtual Reality Headsets based on Transfer Learning [0.0]
Convolutional neural network-based approaches have become widely adopted due to their proven applicability to the Facial Expression Recognition (FER) task.
However, recognizing facial expression while wearing a head-mounted VR headset is a challenging task due to the upper half of the face being completely occluded.
We propose a geometric model to simulate occlusion resulting from a Samsung Gear VR headset that can be applied to existing FER datasets.
arXiv Detail & Related papers (2020-08-12T20:25:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.