Moving Avatars and Agents in Social Extended Reality Environments
- URL: http://arxiv.org/abs/2306.14484v1
- Date: Mon, 26 Jun 2023 07:51:17 GMT
- Title: Moving Avatars and Agents in Social Extended Reality Environments
- Authors: Jann Philipp Freiwald, Susanne Schmidt, Bernhard E. Riecke, Frank
Steinicke
- Abstract summary: We introduce a Smart Avatar system that delivers continuous full-body human representations for noncontinuous locomotion in VR spaces.
We also introduce the concept of Stuttered Locomotion, which can be applied to any continuous locomotion method.
We will discuss the potential of Smart Avatars and Stuttered Locomotion for shared VR experiences.
- Score: 16.094148092964264
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Natural interaction between multiple users within a shared virtual
environment (VE) relies on each user's awareness of the interaction partners'
current positions. This awareness, however, cannot be guaranteed when users
employ noncontinuous locomotion techniques, such as teleportation, which may
confuse bystanders. In this paper, we pursue two approaches to create a
pleasant experience for both the moving user and the bystanders observing that
movement. First, we introduce a Smart Avatar system that delivers
continuous full-body human representations for noncontinuous locomotion in
shared virtual reality (VR) spaces. Smart Avatars imitate their assigned user's
real-world movements when nearby and autonomously navigate to their user when
the distance between them exceeds a certain threshold, i.e., after the user
teleports. As part of the Smart Avatar system, we implemented four avatar
transition techniques and compared them to conventional avatar locomotion in a
user study, revealing significant positive effects on the observer's spatial
awareness, as well as pragmatic and hedonic quality scores. Second, we
introduce the concept of Stuttered Locomotion, which can be applied to any
continuous locomotion method. By converting a continuous movement into
short-interval teleport steps, we provide the merits of noncontinuous
locomotion for the moving user, while observers can easily keep track of the user's
path. Thus, while the experience for observers is similarly positive as with
continuous motion, a user study confirmed that Stuttered Locomotion can
significantly reduce the occurrence of cybersickness symptoms for the moving
user, making it an attractive choice for shared VEs. We discuss the
potential of Smart Avatars and Stuttered Locomotion for shared VR experiences,
both when applied individually and in combination.
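As a concrete illustration of the distance-triggered behavior the abstract describes, the sketch below shows one way a Smart Avatar could switch between imitating its nearby user and autonomously walking toward a distant one. This is a minimal sketch under assumed values: the threshold, walking speed, and all names are illustrative, not taken from the paper's implementation.

```python
import math

# Minimal sketch of the Smart Avatar behavior: mirror the user's tracked
# motion while they are nearby, and walk toward them autonomously once the
# gap exceeds a threshold (e.g., right after a teleport). The constants
# below are assumptions, not values from the paper.

FOLLOW_THRESHOLD = 1.5   # metres; hypothetical switch-over distance
WALK_SPEED = 1.4         # metres/second; hypothetical autonomous gait speed

def update_smart_avatar(avatar_pos, user_pos, dt):
    """Advance the avatar by one frame; returns (new_position, mode)."""
    gap = math.dist(avatar_pos, user_pos)  # Euclidean distance in 3D
    if gap <= FOLLOW_THRESHOLD:
        # Close by: imitate the user's real-world movement directly.
        return list(user_pos), "imitate"
    # Far away: navigate autonomously so bystanders see a continuous,
    # plausible walking trajectory instead of an instantaneous jump.
    step = min(WALK_SPEED * dt, gap)  # never overshoot the user
    new_pos = [a + (u - a) / gap * step
               for a, u in zip(avatar_pos, user_pos)]
    return new_pos, "navigate"
```

Run once per frame, this keeps the avatar locked to a nearby user while producing the continuous catch-up walk that observers in the study evaluated; the paper's four transition techniques would replace the simple straight-line step used here.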
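Stuttered Locomotion admits a similarly compact sketch: continuous input is integrated as usual, but the accumulated displacement is only applied as discrete teleport steps at a fixed interval. The interval length below is an assumed placeholder, not the value evaluated in the user study.

```python
# Minimal sketch of Stuttered Locomotion: convert continuous movement into
# short-interval teleport steps. The moving user avoids smooth optical flow
# (a common cybersickness trigger) while observers can still follow the
# path. STEP_INTERVAL is an assumption, not the study's parameter.

STEP_INTERVAL = 0.25  # seconds between teleport steps (illustrative)

class StutteredLocomotion:
    def __init__(self):
        self.pending = [0.0, 0.0, 0.0]  # displacement since the last step
        self.elapsed = 0.0

    def update(self, position, velocity, dt):
        """Integrate continuous input; teleport when the interval elapses."""
        self.pending = [p + v * dt for p, v in zip(self.pending, velocity)]
        self.elapsed += dt
        if self.elapsed >= STEP_INTERVAL:
            # Apply the whole accumulated displacement as one discrete jump.
            position = [x + p for x, p in zip(position, self.pending)]
            self.pending = [0.0, 0.0, 0.0]
            self.elapsed = 0.0
        return position
```

Because the wrapper only changes how displacement is applied, it can sit on top of any continuous technique, which is what makes the concept applicable to any continuous locomotion method, as the abstract states.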
Related papers
- Tremor Reduction for Accessible Ray Based Interaction in VR Applications [0.0]
Many traditional 2D interface interaction methods have been directly converted to work in a VR space with little alteration to the input mechanism.
In this paper we propose the use of a low-pass filter to normalize user input noise, alleviating fine motor requirements during ray-based interaction.
arXiv Detail & Related papers (2024-05-12T17:07:16Z)
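The low-pass filtering proposed above is often realized as a one-pole exponential moving average; the sketch below applies that common variant to a ray direction each frame. The smoothing factor is arbitrary, and the paper's actual filter design and tuning may differ.

```python
# One-pole (exponential moving average) low-pass filter for noisy input,
# applied component-wise to a controller ray direction. alpha is an
# illustrative value, not the paper's tuned parameter.

class LowPassFilter:
    def __init__(self, alpha=0.15):
        self.alpha = alpha   # 0 < alpha <= 1; smaller = stronger smoothing
        self.state = None    # running filtered estimate

    def filter(self, sample):
        """Blend the new sample into the running estimate."""
        if self.state is None:
            self.state = list(sample)           # initialize on first sample
        else:
            self.state = [s + self.alpha * (x - s)
                          for s, x in zip(self.state, sample)]
        return self.state

ray_filter = LowPassFilter()
smoothed_dir = ray_filter.filter((0.02, -0.98, 0.20))  # call once per frame
```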
- ShareYourReality: Investigating Haptic Feedback and Agency in Virtual Avatar Co-embodiment [10.932344446402276]
Virtual co-embodiment enables two users to share a single avatar in Virtual Reality (VR).
During such experiences, the illusion of shared motion control can break during joint-action activities.
We explore how haptics can enable non-verbal coordination between co-embodied participants.
arXiv Detail & Related papers (2024-03-13T09:23:53Z)
- Universal Humanoid Motion Representations for Physics-Based Control [71.46142106079292]
We present a universal motion representation that encompasses a comprehensive range of motor skills for physics-based humanoid control.
We first learn a motion imitator that can imitate all human motion from a large, unstructured motion dataset.
We then create our motion representation by distilling skills directly from the imitator.
arXiv Detail & Related papers (2023-10-06T20:48:43Z)
- Physics-based Motion Retargeting from Sparse Inputs [73.94570049637717]
Commercial AR/VR products consist only of a headset and controllers, providing very limited sensor data of the user's pose.
We introduce a method to retarget motions in real-time from sparse human sensor data to characters of various morphologies.
We show that the avatar poses often match the user surprisingly well, despite having no sensor information of the lower body available.
arXiv Detail & Related papers (2023-07-04T21:57:05Z)
- HOOV: Hand Out-Of-View Tracking for Proprioceptive Interaction using Inertial Sensing [25.34222794274071]
We present HOOV, a wrist-worn sensing method that allows VR users to interact with objects outside their field of view.
Based on the signals of a single wrist-worn inertial sensor, HOOV continuously estimates the user's hand position in 3-space.
Our novel data-driven method predicts hand positions and trajectories from just the continuous estimation of hand orientation.
arXiv Detail & Related papers (2023-03-13T11:25:32Z)
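HOOV's data-driven step can be pictured as a regressor from a history of wrist-orientation estimates to a 3D hand position. The sketch below is a hypothetical stand-in: window length, quaternion input format, and architecture are assumptions, not the paper's model.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for HOOV's learned mapping: a short history of
# wrist-IMU orientation estimates (quaternions) -> current 3D hand position.
# All sizes here are assumptions for illustration.

WINDOW = 60  # assumed number of orientation samples in the input history

hand_regressor = nn.Sequential(
    nn.Flatten(),               # (batch, WINDOW, 4) -> (batch, WINDOW * 4)
    nn.Linear(WINDOW * 4, 256),
    nn.ReLU(),
    nn.Linear(256, 128),
    nn.ReLU(),
    nn.Linear(128, 3),          # predicted hand position (x, y, z)
)

orientations = torch.randn(8, WINDOW, 4)   # a batch of quaternion histories
positions = hand_regressor(orientations)   # -> shape (8, 3)
```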
- QuestSim: Human Motion Tracking from Sparse Sensors with Simulated Avatars [80.05743236282564]
Real-time tracking of human body motion is crucial for immersive experiences in AR/VR.
We present a reinforcement learning framework that takes in sparse signals from an HMD and two controllers.
We show that a single policy can be robust to diverse locomotion styles, different body sizes, and novel environments.
arXiv Detail & Related papers (2022-09-20T00:25:54Z)
- AvatarPoser: Articulated Full-Body Pose Tracking from Sparse Motion Sensing [24.053096294334694]
We present AvatarPoser, the first learning-based method that predicts full-body poses in world coordinates using only motion input from the user's head and hands.
Our method builds on a Transformer encoder to extract deep features from the input signals and decouples global motion from the learned local joint orientations.
In our evaluation, AvatarPoser achieved new state-of-the-art results on large motion capture datasets.
arXiv Detail & Related papers (2022-07-27T20:52:39Z)
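The Transformer-based mapping AvatarPoser describes can be outlined roughly as below. Joint count, window length, input encoding, and layer sizes are assumptions, and the paper's decoupling of global motion from local joint orientations is omitted for brevity.

```python
import torch
import torch.nn as nn

# Rough sketch of the AvatarPoser idea: a Transformer encoder reads a window
# of head- and hand-tracking signals and regresses per-joint rotations.
# Dimensions below are assumptions, not the published architecture.

NUM_JOINTS = 22     # assumed skeleton size
WINDOW = 40         # assumed input window length (frames)
IN_DIM = 3 * 9      # 3 trackers x (3D position + 6D rotation), an assumption

class SparsePoseNet(nn.Module):
    def __init__(self, d_model=256, n_heads=4, n_layers=3):
        super().__init__()
        self.embed = nn.Linear(IN_DIM, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, NUM_JOINTS * 6)  # 6D rotation per joint

    def forward(self, x):                # x: (batch, WINDOW, IN_DIM)
        h = self.encoder(self.embed(x))  # (batch, WINDOW, d_model)
        return self.head(h[:, -1])       # pose for the most recent frame

poses = SparsePoseNet()(torch.randn(2, WINDOW, IN_DIM))  # -> (2, 132)
```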
- Learning Effect of Lay People in Gesture-Based Locomotion in Virtual Reality [81.5101473684021]
Some of the most promising methods are gesture-based and do not require additional handheld hardware.
Recent work focused mostly on user preference and performance of the different locomotion techniques.
This work investigates whether and how quickly users can adapt to a hand gesture-based locomotion system in VR.
arXiv Detail & Related papers (2022-06-16T10:44:16Z)
- Learning Perceptual Locomotion on Uneven Terrains using Sparse Visual Observations [75.60524561611008]
This work aims to exploit the use of sparse visual observations to achieve perceptual locomotion over a range of commonly seen bumps, ramps, and stairs in human-centred environments.
We first formulate the selection of minimal visual input that can represent the uneven surfaces of interest, and propose a learning framework that integrates such exteroceptive and proprioceptive data.
We validate the learned policy in tasks that require omnidirectional walking over flat ground and forward locomotion over terrains with obstacles, showing a high success rate.
arXiv Detail & Related papers (2021-09-28T20:25:10Z)
- Learning Quadrupedal Locomotion over Challenging Terrain [68.51539602703662]
Legged locomotion can dramatically expand the operational domains of robotics.
Conventional controllers for legged locomotion are based on elaborate state machines that explicitly trigger the execution of motion primitives and reflexes.
Here we present a radically robust controller for legged locomotion in challenging natural environments.
arXiv Detail & Related papers (2020-10-21T19:11:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.