ESLAM: Efficient Dense SLAM System Based on Hybrid Representation of
Signed Distance Fields
- URL: http://arxiv.org/abs/2211.11704v2
- Date: Mon, 3 Apr 2023 08:54:42 GMT
- Title: ESLAM: Efficient Dense SLAM System Based on Hybrid Representation of
Signed Distance Fields
- Authors: Mohammad Mahdi Johari, Camilla Carta, François Fleuret
- Abstract summary: ESLAM reads RGB-D frames with unknown camera poses in a sequential manner and incrementally reconstructs the scene representation.
ESLAM improves the accuracy of 3D reconstruction and camera localization of state-of-the-art dense visual SLAM methods by more than 50%.
- Score: 2.0625936401496237
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present ESLAM, an efficient implicit neural representation method for
Simultaneous Localization and Mapping (SLAM). ESLAM reads RGB-D frames with
unknown camera poses in a sequential manner and incrementally reconstructs the
scene representation while estimating the current camera position in the scene.
We incorporate the latest advances in Neural Radiance Fields (NeRF) into a SLAM
system, resulting in an efficient and accurate dense visual SLAM method. Our
scene representation consists of multi-scale axis-aligned perpendicular feature
planes and shallow decoders that, for each point in the continuous space,
decode the interpolated features into Truncated Signed Distance Field (TSDF)
and RGB values. Our extensive experiments on three standard datasets, Replica,
ScanNet, and TUM RGB-D show that ESLAM improves the accuracy of 3D
reconstruction and camera localization of state-of-the-art dense visual SLAM
methods by more than 50%, while it runs up to 10 times faster and does not
require any pre-training.
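
The abstract describes the core representation as multi-scale axis-aligned feature planes together with shallow decoders that turn interpolated features into a TSDF value and an RGB color. The PyTorch sketch below illustrates that idea only; the plane resolutions, feature dimensions, decoder widths, and the way coarse and fine features are combined are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' code) of the hybrid representation the
# abstract describes: multi-scale axis-aligned feature planes plus shallow
# decoders that map interpolated features to a TSDF value and an RGB color.
# Plane resolutions, feature sizes, decoder widths, and the way the two scales
# are combined are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PlaneFieldSketch(nn.Module):
    def __init__(self, res_coarse=32, res_fine=128, feat_dim=16):
        super().__init__()

        # One (xy, xz, yz) triplet of learnable feature planes per scale.
        def make_planes(res):
            return nn.ParameterList(
                [nn.Parameter(0.01 * torch.randn(1, feat_dim, res, res)) for _ in range(3)]
            )

        self.coarse = make_planes(res_coarse)
        self.fine = make_planes(res_fine)

        # Shallow decoders: concatenated coarse+fine features -> TSDF / RGB.
        d = 2 * feat_dim
        self.sdf_decoder = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 1))
        self.rgb_decoder = nn.Sequential(nn.Linear(d, 32), nn.ReLU(), nn.Linear(32, 3))

    def _sample(self, planes, xyz):
        # xyz: (N, 3) points assumed normalized to [-1, 1].
        # Project each point onto the three axis-aligned planes and
        # bilinearly interpolate, summing the per-plane features.
        coords = [xyz[:, [0, 1]], xyz[:, [0, 2]], xyz[:, [1, 2]]]  # xy, xz, yz
        feats = 0.0
        for plane, uv in zip(planes, coords):
            grid = uv.view(1, -1, 1, 2)                         # (1, N, 1, 2)
            f = F.grid_sample(plane, grid, align_corners=True)  # (1, C, N, 1)
            feats = feats + f.squeeze(0).squeeze(-1).t()        # (N, C)
        return feats

    def forward(self, xyz):
        feat = torch.cat([self._sample(self.coarse, xyz),
                          self._sample(self.fine, xyz)], dim=-1)
        tsdf = torch.tanh(self.sdf_decoder(feat)).squeeze(-1)  # bounded, TSDF-like value
        rgb = torch.sigmoid(self.rgb_decoder(feat))            # colors in [0, 1]
        return tsdf, rgb


# Query 1024 random points in the normalized scene volume.
model = PlaneFieldSketch()
tsdf, rgb = model(torch.rand(1024, 3) * 2 - 1)
print(tsdf.shape, rgb.shape)  # torch.Size([1024]) torch.Size([1024, 3])
```

In the full system described above, such point queries would be generated by sampling along the camera rays of the RGB-D frames, with the feature planes, decoders, and camera poses optimized jointly; this sketch only shows the representation itself.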
Related papers
- IG-SLAM: Instant Gaussian SLAM [6.228980850646457]
3D Gaussian Splatting has recently shown promising results as an alternative scene representation in SLAM systems.
We present IG-SLAM, a dense RGB-only SLAM system that employs robust Dense-SLAM methods for tracking and combines them with Gaussian Splatting.
We demonstrate competitive performance with state-of-the-art RGB-only SLAM systems while achieving faster operation speeds.
arXiv Detail & Related papers (2024-08-02T09:07:31Z)
- NIS-SLAM: Neural Implicit Semantic RGB-D SLAM for 3D Consistent Scene Understanding [31.56016043635702]
We introduce NIS-SLAM, an efficient neural implicit semantic RGB-D SLAM system.
For high-fidelity surface reconstruction and spatially consistent scene understanding, we employ high-frequency multi-resolution tetrahedron-based features.
We also show that our approach can be used in augmented reality applications.
arXiv Detail & Related papers (2024-07-30T14:27:59Z)
- Splat-SLAM: Globally Optimized RGB-only SLAM with 3D Gaussians [87.48403838439391]
3D Gaussian Splatting has emerged as a powerful representation of geometry and appearance for RGB-only dense Simultaneous Localization and Mapping (SLAM).
We propose the first RGB-only SLAM system with a dense 3D Gaussian map representation.
Our experiments on the Replica, TUM-RGBD, and ScanNet datasets indicate the effectiveness of globally optimized 3D Gaussians.
arXiv Detail & Related papers (2024-05-26T12:26:54Z)
- MM3DGS SLAM: Multi-modal 3D Gaussian Splatting for SLAM Using Vision, Depth, and Inertial Measurements [59.70107451308687]
We show for the first time that using 3D Gaussians for map representation with unposed camera images and inertial measurements can enable accurate SLAM.
Our method, MM3DGS, addresses the limitations of prior rendering-based approaches by enabling faster scale awareness and improved trajectory tracking.
We also release a multi-modal dataset, UT-MM, collected from a mobile robot equipped with a camera and an inertial measurement unit.
arXiv Detail & Related papers (2024-04-01T04:57:41Z)
- RGBD GS-ICP SLAM [1.3108652488669732]
We propose a novel dense-representation SLAM approach that fuses Generalized Iterative Closest Point (G-ICP) with 3D Gaussian Splatting (3DGS).
Experimental results demonstrate the effectiveness of our approach, which reaches speeds of up to 107 FPS.
arXiv Detail & Related papers (2024-03-19T08:49:48Z)
- SplaTAM: Splat, Track & Map 3D Gaussians for Dense RGB-D SLAM [48.190398577764284]
SplaTAM is an approach to enable high-fidelity reconstruction from a single unposed RGB-D camera.
It employs a simple online tracking and mapping system tailored to the underlying Gaussian representation.
Experiments show that SplaTAM outperforms existing methods by up to 2x in camera pose estimation, map construction, and novel-view synthesis (a minimal render-and-compare tracking sketch follows this list).
arXiv Detail & Related papers (2023-12-04T18:53:24Z)
- DNS SLAM: Dense Neural Semantic-Informed SLAM [92.39687553022605]
DNS SLAM is a novel neural RGB-D semantic SLAM approach featuring a hybrid representation.
Our method integrates multi-view geometry constraints with image-based feature extraction to improve appearance details.
Our experiments demonstrate state-of-the-art tracking performance on both synthetic and real-world data.
arXiv Detail & Related papers (2023-11-30T21:34:44Z)
- Photo-SLAM: Real-time Simultaneous Localization and Photorealistic Mapping for Monocular, Stereo, and RGB-D Cameras [27.543561055868697]
Photo-SLAM is a novel SLAM framework with a hyper primitives map.
We exploit explicit geometric features for localization and learn implicit photometric features to represent the texture information of the observed environment.
Our proposed system significantly outperforms current state-of-the-art SLAM systems for online photorealistic mapping.
arXiv Detail & Related papers (2023-11-28T12:19:00Z)
- GS-SLAM: Dense Visual SLAM with 3D Gaussian Splatting [51.96353586773191]
We introduce GS-SLAM, the first method to utilize a 3D Gaussian representation in a Simultaneous Localization and Mapping (SLAM) system.
Our method utilizes a real-time differentiable splatting rendering pipeline that offers significant speedup to map optimization and RGB-D rendering.
Our method achieves competitive performance compared with existing state-of-the-art real-time methods on the Replica and TUM-RGBD datasets.
arXiv Detail & Related papers (2023-11-20T12:08:23Z)
- NICER-SLAM: Neural Implicit Scene Encoding for RGB SLAM [111.83168930989503]
NICER-SLAM is a dense RGB SLAM system that simultaneously optimizes camera poses and a hierarchical neural implicit map representation.
We show strong performance in dense mapping, tracking, and novel view synthesis, even competitive with recent RGB-D SLAM systems.
arXiv Detail & Related papers (2023-02-07T17:06:34Z)
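
Several of the Gaussian-splatting entries above (e.g. SplaTAM, GS-SLAM, MM3DGS) track the camera by rendering the current map and minimizing an error against the incoming frame. The sketch below is a heavily simplified, hypothetical stand-in for that render-and-compare loop, not any of those systems: the map is reduced to colored points, the "renderer" is a pinhole projection followed by bilinear sampling of the observed image, and the pose is refined by gradient descent. The intrinsics, learning rate, and iteration count are arbitrary choices for illustration.

```python
# A heavily simplified, hypothetical stand-in for render-and-compare camera
# tracking, not the actual pipeline of any system listed above. The map is a
# set of colored points; the pose is optimized so that each point's color
# sampled from the observed image matches its stored color.
import torch
import torch.nn.functional as F


def axis_angle_to_matrix(w):
    # Rodrigues' formula: axis-angle vector (3,) -> rotation matrix (3, 3).
    theta = w.norm() + 1e-8
    k = w / theta
    zero = torch.zeros(())
    K = torch.stack([
        torch.stack([zero, -k[2], k[1]]),
        torch.stack([k[2], zero, -k[0]]),
        torch.stack([-k[1], k[0], zero]),
    ])
    return torch.eye(3) + torch.sin(theta) * K + (1 - torch.cos(theta)) * (K @ K)


def track_frame(points, colors, image, fx, fy, cx, cy, iters=100):
    # points: (N, 3) map point positions (world frame)
    # colors: (N, 3) their stored RGB colors in [0, 1]
    # image:  (3, H, W) observed RGB frame
    w = torch.zeros(3, requires_grad=True)  # axis-angle rotation
    t = torch.zeros(3, requires_grad=True)  # translation
    opt = torch.optim.Adam([w, t], lr=1e-2)
    H, W = image.shape[1:]
    for _ in range(iters):
        opt.zero_grad()
        R = axis_angle_to_matrix(w)
        p_cam = points @ R.t() + t                   # world -> camera
        z = p_cam[:, 2].clamp(min=1e-3)
        u = fx * p_cam[:, 0] / z + cx                # pinhole projection
        v = fy * p_cam[:, 1] / z + cy
        # Normalize pixel coordinates to [-1, 1] and sample the observed image.
        grid = torch.stack([2 * u / (W - 1) - 1, 2 * v / (H - 1) - 1], dim=-1)
        sampled = F.grid_sample(image[None], grid.view(1, -1, 1, 2),
                                align_corners=True)  # (1, 3, N, 1)
        sampled = sampled.squeeze(0).squeeze(-1).t() # (N, 3)
        loss = F.l1_loss(sampled, colors)            # photometric error
        loss.backward()
        opt.step()
    return axis_angle_to_matrix(w).detach(), t.detach()


# Synthetic data just to exercise the loop; in a real system the points and
# colors come from the map and the frame from the sensor.
pts = torch.rand(500, 3) + torch.tensor([0.0, 0.0, 2.0])  # points in front of the camera
cols = torch.rand(500, 3)
frame = torch.rand(3, 120, 160)
R, t = track_frame(pts, cols, frame, fx=100.0, fy=100.0, cx=80.0, cy=60.0)
```

Actual systems replace the point projection with a differentiable 3D Gaussian rasterizer (or, in ESLAM's case, with TSDF-based rendering along rays), add depth and regularization terms, and interleave tracking with map optimization.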