Co-SLAM: Joint Coordinate and Sparse Parametric Encodings for Neural
Real-Time SLAM
- URL: http://arxiv.org/abs/2304.14377v1
- Date: Thu, 27 Apr 2023 17:46:45 GMT
- Title: Co-SLAM: Joint Coordinate and Sparse Parametric Encodings for Neural
Real-Time SLAM
- Authors: Hengyi Wang, Jingwen Wang, Lourdes Agapito
- Abstract summary: Co-SLAM is an RGB-D SLAM system based on a hybrid representation.
It performs robust camera tracking and high-fidelity surface reconstruction in real time.
- Score: 14.56883275492083
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present Co-SLAM, a neural RGB-D SLAM system based on a hybrid
representation, that performs robust camera tracking and high-fidelity surface
reconstruction in real time. Co-SLAM represents the scene as a multi-resolution
hash-grid to exploit its high convergence speed and ability to represent
high-frequency local features. In addition, Co-SLAM incorporates one-blob
encoding, to encourage surface coherence and completion in unobserved areas.
This joint parametric-coordinate encoding enables real-time and robust
performance by bringing the best of both worlds: fast convergence and surface
hole filling. Moreover, our ray sampling strategy allows Co-SLAM to perform
global bundle adjustment over all keyframes instead of requiring keyframe
selection to maintain a small number of active keyframes as competing neural
SLAM approaches do. Experimental results show that Co-SLAM runs at 10-17Hz and
achieves state-of-the-art scene reconstruction and competitive tracking
performance on various datasets and benchmarks (ScanNet, TUM, Replica,
Synthetic RGBD). Project page: https://hengyiwang.github.io/projects/CoSLAM
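The abstract describes two ideas that are easy to illustrate in code. First, the joint parametric-coordinate encoding: a per-point feature is formed by concatenating a smooth coordinate encoding (one-blob, approximated here with Gaussian bumps) with features interpolated from a multi-resolution hash grid, and a small MLP decodes it. The PyTorch sketch below is not the authors' implementation; the bin count, number of levels, hash-table size, and decoder layout are illustrative assumptions (the actual system relies on optimized hash-grid kernels and a different SDF/color decoder layout).

```python
# Hypothetical sketch of a joint one-blob + hash-grid encoding (not Co-SLAM's code).
import torch
import torch.nn as nn


def one_blob_encoding(x, n_bins=16, sigma=None):
    """Gaussian-bump approximation of one-blob encoding for coordinates in [0, 1]^D.
    x: (N, D) -> (N, D * n_bins)."""
    sigma = sigma or 1.0 / n_bins
    centers = torch.linspace(0.0, 1.0, n_bins, device=x.device)   # (n_bins,)
    diff = x.unsqueeze(-1) - centers                               # (N, D, n_bins)
    return torch.exp(-0.5 * (diff / sigma) ** 2).flatten(1)


class HashGridEncoding(nn.Module):
    """Simplified multi-resolution hash grid (Instant-NGP style lookup)."""

    PRIMES = (1, 2654435761, 805459861)

    def __init__(self, n_levels=8, base_res=16, growth=1.5,
                 log2_hashmap_size=15, feat_dim=2):
        super().__init__()
        self.feat_dim = feat_dim
        self.resolutions = [int(base_res * growth ** l) for l in range(n_levels)]
        self.table_size = 2 ** log2_hashmap_size
        self.embeddings = nn.ModuleList(
            [nn.Embedding(self.table_size, feat_dim) for _ in self.resolutions])
        for e in self.embeddings:
            nn.init.uniform_(e.weight, -1e-4, 1e-4)

    def _hash(self, ids):
        # ids: (..., 3) integer grid corners -> flat hash-table index
        h = torch.zeros(ids.shape[:-1], dtype=torch.long, device=ids.device)
        for d, p in enumerate(self.PRIMES):
            h = h ^ (ids[..., d] * p)
        return h % self.table_size

    def forward(self, x):
        # x: (N, 3) in [0, 1]^3 -> (N, n_levels * feat_dim)
        feats = []
        offs = torch.tensor([[i, j, k] for i in (0, 1) for j in (0, 1)
                             for k in (0, 1)], device=x.device)       # (8, 3)
        for res, emb in zip(self.resolutions, self.embeddings):
            xs = x * res
            x0 = xs.floor().long()                    # lower voxel corner
            w = (xs - x0.float()).unsqueeze(1)        # (N, 1, 3) trilinear weights
            corners = x0.unsqueeze(1) + offs          # (N, 8, 3)
            cw = torch.where(offs.bool(), w, 1.0 - w).prod(-1)        # (N, 8)
            f = emb(self._hash(corners))              # (N, 8, feat_dim)
            feats.append((cw.unsqueeze(-1) * f).sum(dim=1))
        return torch.cat(feats, dim=-1)


class JointEncodingField(nn.Module):
    """Decode concatenated [one-blob | hash-grid] features to SDF + color."""

    def __init__(self, n_bins=16):
        super().__init__()
        self.n_bins = n_bins
        self.grid = HashGridEncoding()
        in_dim = 3 * n_bins + len(self.grid.resolutions) * self.grid.feat_dim
        self.mlp = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 4))    # 1 SDF + 3 RGB channels

    def forward(self, x):
        h = torch.cat([one_blob_encoding(x, self.n_bins), self.grid(x)], dim=-1)
        out = self.mlp(h)
        return out[..., :1], torch.sigmoid(out[..., 1:])   # sdf, rgb


pts = torch.rand(1024, 3)                  # normalized sample points along rays
sdf, rgb = JointEncodingField()(pts)       # (1024, 1) SDF, (1024, 3) color
```

Second, the ray sampling behind the global bundle adjustment: rather than maintaining a small window of active keyframes, a fixed random subset of pixels is stored for every keyframe and each optimization batch is drawn uniformly from that global pool, so gradients can reach all keyframe poses. Again a hedged sketch under assumed interfaces; the class and field names are hypothetical.

```python
# Hypothetical sketch of a global ray pool for bundle adjustment over all keyframes.
import torch


class GlobalRayPool:
    """Keeps a fixed random pixel subset from every keyframe."""

    def __init__(self, pixels_per_kf=2048):
        self.pixels_per_kf = pixels_per_kf
        self.samples = []                          # one dict per keyframe

    def add_keyframe(self, kf_id, rgb, depth):
        # rgb: (H, W, 3), depth: (H, W); store only a random pixel subset
        H, W, _ = rgb.shape
        idx = torch.randint(0, H * W, (self.pixels_per_kf,))
        self.samples.append({
            "kf_id": torch.full((self.pixels_per_kf,), kf_id, dtype=torch.long),
            "uv": torch.stack((idx // W, idx % W), dim=-1),
            "rgb": rgb.reshape(-1, 3)[idx],
            "depth": depth.reshape(-1)[idx],
        })

    def sample_batch(self, n_rays):
        # Draw rays uniformly over *all* keyframes (global BA), not an
        # actively selected keyframe window.
        pool = {k: torch.cat([s[k] for s in self.samples])
                for k in self.samples[0]}
        pick = torch.randint(0, pool["rgb"].shape[0], (n_rays,))
        return {k: v[pick] for k, v in pool.items()}
```

In a mapping iteration, a batch from sample_batch would be rendered with the current pose estimates of the keyframes in batch["kf_id"], and the photometric and geometric losses back-propagate into both the scene representation and those poses.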
Related papers
- EvenNICER-SLAM: Event-based Neural Implicit Encoding SLAM [69.83383687049994]
We propose EvenNICER-SLAM, a novel approach to dense visual simultaneous localization and mapping.
EvenNICER-SLAM incorporates event cameras that respond to intensity changes instead of absolute brightness.
Our results suggest the potential for event cameras to improve the robustness of dense SLAM systems against fast camera motion in real-world scenarios.
arXiv Detail & Related papers (2024-10-04T13:52:01Z) - GlORIE-SLAM: Globally Optimized RGB-only Implicit Encoding Point Cloud SLAM [53.6402869027093]
We propose an efficient RGB-only dense SLAM system using a flexible neural point cloud scene representation.
We also introduce a novel DSPO layer for bundle adjustment, which jointly optimizes pose and depth along with the scale of the monocular depth.
arXiv Detail & Related papers (2024-03-28T16:32:06Z) - MUTE-SLAM: Real-Time Neural SLAM with Multiple Tri-Plane Hash Representations [6.266208986510979]
MUTE-SLAM is a real-time neural RGB-D SLAM system employing multiple tri-plane hash-encodings for efficient scene representation.
MUTE-SLAM effectively tracks camera positions and incrementally builds a scalable multi-map representation for both small and large indoor environments.
arXiv Detail & Related papers (2024-03-26T14:53:24Z) - DNS SLAM: Dense Neural Semantic-Informed SLAM [92.39687553022605]
DNS SLAM is a novel neural RGB-D semantic SLAM approach featuring a hybrid representation.
Our method integrates multi-view geometry constraints with image-based feature extraction to improve appearance details.
Our experiments achieve state-of-the-art tracking performance on both synthetic and real-world data.
arXiv Detail & Related papers (2023-11-30T21:34:44Z) - CP-SLAM: Collaborative Neural Point-based SLAM System [54.916578456416204]
This paper presents a collaborative implicit neural localization and mapping (SLAM) system with RGB-D image sequences.
In order to enable all these modules in a unified framework, we propose a novel neural point-based 3D scene representation.
A distributed-to-centralized learning strategy is proposed for the collaborative implicit SLAM to improve consistency and cooperation.
arXiv Detail & Related papers (2023-11-14T09:17:15Z) - NICER-SLAM: Neural Implicit Scene Encoding for RGB SLAM [111.83168930989503]
NICER-SLAM is a dense RGB SLAM system that simultaneously optimizes camera poses and a hierarchical neural implicit map representation.
We show strong performance in dense mapping, tracking, and novel view synthesis, even competitive with recent RGB-D SLAM systems.
arXiv Detail & Related papers (2023-02-07T17:06:34Z) - ESLAM: Efficient Dense SLAM System Based on Hybrid Representation of
Signed Distance Fields [2.0625936401496237]
ESLAM reads RGB-D frames with unknown camera poses in a sequential manner and incrementally reconstructs the scene representation.
ESLAM improves the accuracy of 3D reconstruction and camera localization of state-of-the-art dense visual SLAM methods by more than 50%.
arXiv Detail & Related papers (2022-11-21T18:25:14Z) - NICE-SLAM: Neural Implicit Scalable Encoding for SLAM [112.6093688226293]
NICE-SLAM is a dense SLAM system that incorporates multi-level local information by introducing a hierarchical scene representation.
Compared to recent neural implicit SLAM systems, our approach is more scalable, efficient, and robust.
arXiv Detail & Related papers (2021-12-22T18:45:44Z)
This list is automatically generated from the titles and abstracts of the papers on this site.