Dense RGB SLAM with Neural Implicit Maps
- URL: http://arxiv.org/abs/2301.08930v1
- Date: Sat, 21 Jan 2023 09:54:07 GMT
- Title: Dense RGB SLAM with Neural Implicit Maps
- Authors: Heng Li, Xiaodong Gu, Weihao Yuan, Luwei Yang, Zilong Dong, Ping Tan
- Abstract summary: We present a dense RGB SLAM method with neural implicit map representation.
Our method simultaneously solves the camera motion and the neural implicit map by matching the rendered and input video frames.
Our method achieves more favorable results than previous methods and even surpasses some recent RGB-D SLAM methods.
- Score: 34.37572307973734
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: There is an emerging trend of using neural implicit functions for map
representation in Simultaneous Localization and Mapping (SLAM). Some pioneer
works have achieved encouraging results on RGB-D SLAM. In this paper, we
present a dense RGB SLAM method with neural implicit map representation. To
reach this challenging goal without depth input, we introduce a hierarchical
feature volume to facilitate the implicit map decoder. This design effectively
fuses shape cues across different scales to facilitate map reconstruction. Our
method simultaneously solves the camera motion and the neural implicit map by
matching the rendered and input video frames. To facilitate optimization, we
further propose a photometric warping loss in the spirit of multi-view stereo
to better constrain the camera pose and scene geometry. We evaluate our method
on commonly used benchmarks and compare it with modern RGB and RGB-D SLAM
systems. Our method achieves more favorable results than previous methods and even
surpasses some recent RGB-D SLAM methods. Our source code will be publicly
available.
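The photometric warping loss is described only at a high level in the abstract. The sketch below, assuming a pinhole camera with intrinsics K, per-pixel depth for one frame rendered from the implicit map, and a relative pose T_ji between two frames, illustrates what such a multi-view-stereo-style loss could look like in PyTorch. The function name, tensor layouts, and plain L1 error are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of a multi-view-stereo style
# photometric warping loss. Assumptions: pinhole intrinsics K, per-pixel
# depth for frame i rendered from the implicit map, and a relative pose
# T_ji mapping points from frame i's camera frame into frame j's.
import torch
import torch.nn.functional as F


def photometric_warping_loss(image_i, image_j, depth_i, K, T_ji):
    """image_*: [H, W, 3] in [0, 1]; depth_i: [H, W]; K: [3, 3]; T_ji: [4, 4]."""
    H, W, _ = image_i.shape
    device = image_i.device

    # Pixel grid of frame i (u along width, v along height).
    v, u = torch.meshgrid(
        torch.arange(H, device=device, dtype=torch.float32),
        torch.arange(W, device=device, dtype=torch.float32),
        indexing="ij",
    )
    pix = torch.stack([u, v, torch.ones_like(u)], dim=-1).reshape(-1, 3)

    # Back-project pixels with the depth rendered from the implicit map.
    rays = (torch.linalg.inv(K) @ pix.T).T            # [HW, 3]
    pts_i = rays * depth_i.reshape(-1, 1)             # [HW, 3] in frame i

    # Rigidly transform into frame j and project with the pinhole model.
    pts_j = (T_ji[:3, :3] @ pts_i.T).T + T_ji[:3, 3]
    proj = (K @ pts_j.T).T
    uv_j = proj[:, :2] / proj[:, 2:3].clamp(min=1e-6)

    # Normalize to [-1, 1] and bilinearly sample frame j at the warped pixels.
    gx = 2.0 * uv_j[:, 0] / (W - 1) - 1.0
    gy = 2.0 * uv_j[:, 1] / (H - 1) - 1.0
    grid = torch.stack([gx, gy], dim=-1).reshape(1, H, W, 2)
    warped = F.grid_sample(
        image_j.permute(2, 0, 1).unsqueeze(0),         # [1, 3, H, W]
        grid, mode="bilinear", padding_mode="zeros", align_corners=True,
    ).squeeze(0).permute(1, 2, 0)                      # [H, W, 3]

    # Only penalize pixels that land in front of frame j and inside its image.
    valid = ((pts_j[:, 2] > 0) & (gx.abs() <= 1) & (gy.abs() <= 1)).reshape(H, W)

    # L1 photometric error; gradients flow back into the pose and the depth
    # (and hence into the implicit map that rendered it).
    err = (image_i - warped).abs().mean(dim=-1)        # [H, W]
    return (err * valid).sum() / valid.sum().clamp(min=1)
```

In the setting described above, such a term would be minimized jointly with the rendered-versus-input frame matching, over the camera poses and the parameters of the hierarchical feature volume and decoder.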
Related papers
- IG-SLAM: Instant Gaussian SLAM [6.228980850646457]
3D Gaussian Splatting has recently shown promising results as an alternative scene representation in SLAM systems.
We present IG-SLAM, a dense RGB-only SLAM system that employs robust Dense-SLAM methods for tracking and combines them with Gaussian Splatting.
We demonstrate competitive performance with state-of-the-art RGB-only SLAM systems while achieving faster operation speeds.
arXiv Detail & Related papers (2024-08-02T09:07:31Z) - Splat-SLAM: Globally Optimized RGB-only SLAM with 3D Gaussians [87.48403838439391]
3D Gaussian Splatting has emerged as a powerful representation of geometry and appearance for RGB-only dense SLAM.
We propose the first RGB-only SLAM system with a dense 3D Gaussian map representation.
Our experiments on the Replica, TUM-RGBD, and ScanNet datasets indicate the effectiveness of globally optimized 3D Gaussians.
arXiv Detail & Related papers (2024-05-26T12:26:54Z) - GlORIE-SLAM: Globally Optimized RGB-only Implicit Encoding Point Cloud SLAM [53.6402869027093]
We propose an efficient RGB-only dense SLAM system using a flexible neural point cloud scene representation.
We also introduce a novel DSPO layer for bundle adjustment, which optimizes the pose and depth of keyframes along with the scale of the monocular depth.
arXiv Detail & Related papers (2024-03-28T16:32:06Z) - Loopy-SLAM: Dense Neural SLAM with Loop Closures [53.11936461015725]
We introduce Loopy-SLAM, which globally optimizes poses and the dense 3D model.
We use frame-to-model tracking with a data-driven point-based submap generation method and trigger loop closures online by performing global place recognition.
Evaluation on the synthetic Replica and real-world TUM-RGBD and ScanNet datasets demonstrates competitive or superior performance in tracking, mapping, and rendering accuracy compared to existing dense neural RGB-D SLAM methods.
arXiv Detail & Related papers (2024-02-14T18:18:32Z) - DNS SLAM: Dense Neural Semantic-Informed SLAM [92.39687553022605]
DNS SLAM is a novel neural RGB-D semantic SLAM approach featuring a hybrid representation.
Our method integrates multi-view geometry constraints with image-based feature extraction to improve appearance details.
Our experiments achieve state-of-the-art tracking performance on both synthetic and real-world data.
arXiv Detail & Related papers (2023-11-30T21:34:44Z) - GS-SLAM: Dense Visual SLAM with 3D Gaussian Splatting [51.96353586773191]
We introduce GS-SLAM, which is the first to utilize a 3D Gaussian representation in a Simultaneous Localization and Mapping (SLAM) system.
Our method utilizes a real-time differentiable splatting rendering pipeline that offers significant speedup to map optimization and RGB-D rendering.
Our method achieves competitive performance compared with existing state-of-the-art real-time methods on the Replica and TUM-RGBD datasets.
arXiv Detail & Related papers (2023-11-20T12:08:23Z) - FMapping: Factorized Efficient Neural Field Mapping for Real-Time Dense
RGB SLAM [3.6985351289638957]
We introduce FMapping, an efficient neural field mapping framework that facilitates the continuous estimation of a colorized point cloud map in real-time dense RGB SLAM.
We propose an effective factorization scheme for scene representation and introduce a sliding window strategy to reduce uncertainty in scene reconstruction.
arXiv Detail & Related papers (2023-06-01T11:51:46Z) - Point-SLAM: Dense Neural Point Cloud-based SLAM [61.96492935210654]
We propose a dense neural simultaneous localization and mapping (SLAM) approach for monocular RGBD input.
We demonstrate that both tracking and mapping can be performed with the same point-based neural scene representation.
arXiv Detail & Related papers (2023-04-09T16:48:26Z) - ESLAM: Efficient Dense SLAM System Based on Hybrid Representation of
Signed Distance Fields [2.0625936401496237]
ESLAM reads RGB-D frames with unknown camera poses in a sequential manner and incrementally reconstructs the scene representation.
ESLAM improves the accuracy of 3D reconstruction and camera localization over state-of-the-art dense visual SLAM methods by more than 50%.
arXiv Detail & Related papers (2022-11-21T18:25:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.