EgoLocate: Real-time Motion Capture, Localization, and Mapping with
Sparse Body-mounted Sensors
- URL: http://arxiv.org/abs/2305.01599v1
- Date: Tue, 2 May 2023 16:56:53 GMT
- Title: EgoLocate: Real-time Motion Capture, Localization, and Mapping with
Sparse Body-mounted Sensors
- Authors: Xinyu Yi, Yuxiao Zhou, Marc Habermann, Vladislav Golyanik, Shaohua
Pan, Christian Theobalt, Feng Xu
- Abstract summary: We develop a system that simultaneously performs human motion capture (mocap), localization, and mapping in real time from sparse body-mounted sensors.
Localization, a key challenge for both fields, is largely improved by our technique compared with the state of the art in each field.
- Score: 74.1275051763006
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Human and environment sensing are two important topics in Computer Vision and
Graphics. Human motion is often captured by inertial sensors, while the
environment is mostly reconstructed using cameras. We integrate the two
techniques together in EgoLocate, a system that simultaneously performs human
motion capture (mocap), localization, and mapping in real time from sparse
body-mounted sensors, including 6 inertial measurement units (IMUs) and a
monocular phone camera. On one hand, inertial mocap suffers from large
translation drift due to the lack of a global positioning signal. EgoLocate
leverages image-based simultaneous localization and mapping (SLAM) techniques
to locate the human in the reconstructed scene. On the other hand, SLAM often
fails when visual features are poor. EgoLocate uses inertial mocap to provide
a strong prior on the camera motion. Experiments show that localization, a key
challenge for both fields, is largely improved by our technique compared with
the state of the art in each field. Our codes are
available for research at https://xinyu-yi.github.io/EgoLocate/.
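To make the bidirectional coupling described in the abstract concrete, below is a minimal, illustrative Python sketch (not the authors' released implementation): inertial mocap supplies a camera-motion prior to monocular SLAM tracking, and the drift-free SLAM localization is blended back into the mocap root translation. All class and method names (EgoLocateSketch, mocap.estimate, slam.track, etc.) are hypothetical placeholders.

```python
class EgoLocateSketch:
    """Illustrative coupling of sparse-IMU mocap and monocular SLAM (hypothetical interfaces)."""

    def __init__(self, mocap, slam):
        self.mocap = mocap  # sparse-IMU (6 IMUs) mocap module, assumed interface
        self.slam = slam    # monocular SLAM module, assumed interface

    def step(self, imu_readings, frame):
        # 1) Inertial mocap: body pose and root translation (translation drifts over time).
        pose, root_trans = self.mocap.estimate(imu_readings)

        # 2) The mocap-predicted camera pose serves as a strong prior for SLAM tracking,
        #    which keeps tracking alive when visual features are poor.
        camera_prior = self.mocap.predict_camera_pose(pose, root_trans)  # assumed 4x4 pose matrix
        camera_pose, tracked = self.slam.track(frame, prior=camera_prior)

        # 3) When SLAM tracking succeeds, its localization in the reconstructed map
        #    corrects the mocap translation drift (here: a simple weighted blend).
        if tracked:
            w = self.slam.confidence()  # assumed scalar in [0, 1]
            root_trans = w * camera_pose[:3, 3] + (1.0 - w) * root_trans

        return pose, root_trans, camera_pose
```

The released system is more tightly integrated than this post-hoc blend; the sketch only conveys the direction of information flow described in the abstract.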
Related papers
- EgoHDM: An Online Egocentric-Inertial Human Motion Capture, Localization, and Dense Mapping System [11.89252820871709]
We present EgoHDM, an online egocentric-inertial human motion capture (mocap), localization, and dense mapping system.
Our system uses 6 inertial measurement units (IMUs) and a commodity head-mounted RGB camera.
arXiv Detail & Related papers (2024-08-31T04:19:02Z)
- Synergistic Global-space Camera and Human Reconstruction from Videos [41.309293977251855]
This work introduces Synergistic Camera and Human Reconstruction (SynCHMR) to marry the best of both worlds.
Specifically, we design a Human-aware Metric SLAM to reconstruct metric-scale camera poses and scene point clouds.
We further learn a Scene-aware SMPL Denoiser to enhance world-frame HMR by incorporating spatio-temporal coherency and dynamic scene constraints.
arXiv Detail & Related papers (2024-05-23T17:57:50Z)
- SparsePoser: Real-time Full-body Motion Reconstruction from Sparse Data [1.494051815405093]
We introduce SparsePoser, a novel deep learning-based solution for reconstructing a full-body pose from sparse data.
Our system incorporates a convolutional-based autoencoder that synthesizes high-quality continuous human poses.
We show that our method outperforms state-of-the-art techniques using IMU sensors or 6-DoF tracking devices.
arXiv Detail & Related papers (2023-11-03T18:48:01Z)
- Fusing Monocular Images and Sparse IMU Signals for Real-time Human Motion Capture [8.125716139367142]
We propose a method that fuses monocular images and sparse IMUs for real-time human motion capture.
Our method contains a dual coordinate strategy to fully explore the IMU signals with different goals in motion capture.
Our technique significantly outperforms the state-of-the-art vision, IMU, and combined methods on both global orientation and local pose estimation.
arXiv Detail & Related papers (2023-09-01T07:52:08Z)
- TRACE: 5D Temporal Regression of Avatars with Dynamic Cameras in 3D Environments [106.80978555346958]
Current methods can't reliably estimate moving humans in global coordinates.
TRACE is the first one-stage method to jointly recover and track 3D humans in global coordinates from dynamic cameras.
It achieves state-of-the-art performance on tracking and HPS benchmarks.
arXiv Detail & Related papers (2023-06-05T13:00:44Z)
- HULC: 3D Human Motion Capture with Pose Manifold Sampling and Dense Contact Guidance [82.09463058198546]
Marker-less monocular 3D human motion capture (MoCap) with scene interactions is a challenging research topic relevant for extended reality, robotics and virtual avatar generation.
We propose HULC, a new approach for 3D human MoCap which is aware of the scene geometry.
arXiv Detail & Related papers (2022-05-11T17:59:31Z)
- Neural Monocular 3D Human Motion Capture with Physical Awareness [76.55971509794598]
We present a new trainable system for physically plausible markerless 3D human motion capture.
Unlike most neural methods for human motion capture, our approach is aware of physical and environmental constraints.
It produces smooth and physically principled 3D motions at an interactive frame rate in a wide variety of challenging scenes.
arXiv Detail & Related papers (2021-05-03T17:57:07Z)
- Human POSEitioning System (HPS): 3D Human Pose Estimation and Self-localization in Large Scenes from Body-Mounted Sensors [71.29186299435423]
We introduce the Human POSEitioning System (HPS), a method to recover the full 3D pose of a human registered with a 3D scan of the surrounding environment.
We show that our optimization-based integration exploits the benefits of both sensing modalities, resulting in pose accuracy that is free of drift.
HPS could be used for VR/AR applications where humans interact with the scene without requiring direct line of sight with an external camera.
arXiv Detail & Related papers (2021-03-31T17:58:31Z)
- PhysCap: Physically Plausible Monocular 3D Motion Capture in Real Time [89.68248627276955]
Marker-less 3D motion capture from a single colour camera has seen significant progress.
However, it is a very challenging and severely ill-posed problem.
We present PhysCap, the first algorithm for physically plausible, real-time and marker-less human 3D motion capture.
arXiv Detail & Related papers (2020-08-20T10:46:32Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.