Dense RGB-D-Inertial SLAM with Map Deformations
- URL: http://arxiv.org/abs/2207.10940v1
- Date: Fri, 22 Jul 2022 08:33:38 GMT
- Title: Dense RGB-D-Inertial SLAM with Map Deformations
- Authors: Tristan Laidlow, Michael Bloesch, Wenbin Li, Stefan Leutenegger
- Abstract summary: We propose the first tightly-coupled dense RGB-D-inertial SLAM system.
We show that our system is more robust to fast motions and periods of low texture and low geometric variation than a related RGB-D-only SLAM system.
- Score: 25.03159756734727
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: While dense visual SLAM methods are capable of estimating dense
reconstructions of the environment, they suffer from a lack of robustness in
their tracking step, especially when the optimisation is poorly initialised.
Sparse visual SLAM systems have attained high levels of accuracy and robustness
through the inclusion of inertial measurements in a tightly-coupled fusion.
Inspired by this performance, we propose the first tightly-coupled dense
RGB-D-inertial SLAM system.
Our system has real-time capability while running on a GPU. It jointly
optimises for the camera pose, velocity, IMU biases and gravity direction while
building up a globally consistent, fully dense surfel-based 3D reconstruction
of the environment. Through a series of experiments on both synthetic and real
world datasets, we show that our dense visual-inertial SLAM system is more
robust to fast motions and periods of low texture and low geometric variation
than a related RGB-D-only SLAM system.
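The abstract's key idea, tightly coupling a dense visual term with inertial terms in one joint optimisation over pose, velocity and IMU biases, can be illustrated with a deliberately tiny sketch. Everything below is a toy assumption (1-D motion, a flat wall as the "map", a single accelerometer reading), not the paper's actual GPU implementation; it only shows why a shared least-squares cost lets the IMU constrain the state when the visual term alone would be weak.

```python
import numpy as np

# Toy 1-D sketch of tightly-coupled visual-inertial fusion. All names and
# numbers are illustrative assumptions, not taken from the paper.

WALL = 5.0   # distance of a flat surface from the origin (stand-in "map")
G = 0.0      # gravity component along the motion axis (assumed known here)
DT = 0.1     # time between frames

def joint_cost(state, p0, v0, a_meas, depth_meas):
    """One least-squares cost over the shared state [p1, v1, bias]."""
    p1, v1, bias = state
    a = a_meas - bias + G                    # bias-corrected acceleration
    # IMU residuals: constant-acceleration prediction vs. estimated state
    r_p = p1 - (p0 + v0 * DT + 0.5 * a * DT**2)
    r_v = v1 - (v0 + a * DT)
    # Dense visual residual: depth predicted from the map vs. measurement
    r_d = depth_meas - (WALL - p1)
    return r_p**2 + r_v**2 + r_d**2

# Simulate one step with ground-truth motion and a small accelerometer bias.
p0, v0, bias_true, a_true = 0.0, 1.0, 0.05, 2.0
p1_true = p0 + v0 * DT + 0.5 * a_true * DT**2
v1_true = v0 + a_true * DT
a_meas = a_true + bias_true                  # biased IMU reading
depth_meas = WALL - p1_true                  # noiseless depth measurement

truth = np.array([p1_true, v1_true, bias_true])
```

Because both residual groups reference the same state, minimising `joint_cost` estimates pose, velocity and bias simultaneously; the true state drives the cost to zero, while any perturbation of the pose is penalised by the visual and inertial terms together.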
Related papers
- MGSO: Monocular Real-time Photometric SLAM with Efficient 3D Gaussian Splatting [8.577428137443246]
We present Monocular GSO, a novel real-time SLAM system that integrates photometric SLAM with 3DGS.
Our system generates reconstructions with a balance of quality, memory efficiency, and speed that outperforms the state-of-the-art.
Not only do we surpass contemporary systems, but experiments also show that we maintain our performance on laptop hardware.
arXiv Detail & Related papers (2024-09-19T19:07:05Z) - IG-SLAM: Instant Gaussian SLAM [6.228980850646457]
3D Gaussian Splatting has recently shown promising results as an alternative scene representation in SLAM systems.
We present IG-SLAM, a dense RGB-only SLAM system that employs robust Dense-SLAM methods for tracking and combines them with Gaussian Splatting.
We demonstrate competitive performance with state-of-the-art RGB-only SLAM systems while achieving faster operation speeds.
arXiv Detail & Related papers (2024-08-02T09:07:31Z) - Splat-SLAM: Globally Optimized RGB-only SLAM with 3D Gaussians [87.48403838439391]
3D Gaussian Splatting has emerged as a powerful representation of geometry and appearance for RGB-only dense SLAM.
We propose the first RGB-only SLAM system with a dense 3D Gaussian map representation.
Our experiments on the Replica, TUM-RGBD, and ScanNet datasets indicate the effectiveness of globally optimized 3D Gaussians.
arXiv Detail & Related papers (2024-05-26T12:26:54Z) - GlORIE-SLAM: Globally Optimized RGB-only Implicit Encoding Point Cloud SLAM [53.6402869027093]
We propose an efficient RGB-only dense SLAM system using a flexible neural point cloud scene representation.
We also introduce a novel DSPO layer for bundle adjustment, which optimizes the pose and depth of implicit representations along with the scale of the monocular depth.
arXiv Detail & Related papers (2024-03-28T16:32:06Z) - Gaussian Splatting SLAM [16.3858380078553]
We present the first application of 3D Gaussian Splatting in monocular SLAM.
Our method runs live at 3fps, unifying the required representation for accurate tracking, mapping, and high-quality rendering.
Several innovations are required to continuously reconstruct 3D scenes with high fidelity from a live camera.
arXiv Detail & Related papers (2023-12-11T18:19:04Z) - DNS SLAM: Dense Neural Semantic-Informed SLAM [92.39687553022605]
DNS SLAM is a novel neural RGB-D semantic SLAM approach featuring a hybrid representation.
Our method integrates multi-view geometry constraints with image-based feature extraction to improve appearance details.
Our experiments achieve state-of-the-art tracking performance on both synthetic and real-world data.
arXiv Detail & Related papers (2023-11-30T21:34:44Z) - GO-SLAM: Global Optimization for Consistent 3D Instant Reconstruction [45.49960166785063]
GO-SLAM is a deep-learning-based dense visual SLAM framework globally optimizing poses and 3D reconstruction in real-time.
Results on various synthetic and real-world datasets demonstrate that GO-SLAM outperforms state-of-the-art approaches in both tracking robustness and reconstruction accuracy.
arXiv Detail & Related papers (2023-09-05T17:59:58Z) - NICER-SLAM: Neural Implicit Scene Encoding for RGB SLAM [111.83168930989503]
NICER-SLAM is a dense RGB SLAM system that simultaneously optimizes for camera poses and a hierarchical neural implicit map representation.
We show strong performance in dense mapping, tracking, and novel view synthesis, even competitive with recent RGB-D SLAM systems.
arXiv Detail & Related papers (2023-02-07T17:06:34Z) - Pseudo RGB-D for Self-Improving Monocular SLAM and Depth Prediction [72.30870535815258]
Monocular SLAM and CNNs for monocular depth prediction represent two largely disjoint approaches towards building a 3D map of the surrounding environment.
We propose a joint narrow and wide baseline based self-improving framework, where on the one hand the CNN-predicted depth is leveraged to perform pseudo RGB-D feature-based SLAM.
On the other hand, the bundle-adjusted 3D scene structures and camera poses from the more principled geometric SLAM are injected back into the depth network through novel wide baseline losses.
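The self-improving loop described above can be sketched with a toy stand-in. All quantities below are synthetic assumptions (the real system uses a CNN, feature-based SLAM and bundle adjustment): here the "depth network" is reduced to a single miscalibrated scale parameter, and the bundle-adjusted structure acts as the supervision signal that gradually corrects it.

```python
# Toy sketch of the self-improving loop: SLAM-recovered structure supervises
# the depth predictor. Names and numbers are illustrative assumptions.

ba_depths = [2.0, 4.0, 8.0]   # stand-in for bundle-adjusted metric depths
scale = 0.5                   # "network": a single miscalibrated depth scale
lr = 0.01                     # gradient-descent step size

def predict(scale):
    # CNN depth prediction, correct only up to the unknown scale
    return [scale * d for d in ba_depths]

for step in range(200):
    pred = predict(scale)
    # "Wide-baseline loss": squared mismatch between predicted depth and the
    # geometrically consistent structure from bundle adjustment.
    grad = sum(2 * (p - t) * t for p, t in zip(pred, ba_depths)) / len(ba_depths)
    scale -= lr * grad        # inject geometric supervision back into the net
```

The point of the sketch is the direction of information flow: the geometric back-end supplies metrically consistent targets, and gradient descent pulls the predictor's scale toward them, mirroring how the wide-baseline losses refine the depth network.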
arXiv Detail & Related papers (2020-04-22T16:31:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.