NICE-SLAM: Neural Implicit Scalable Encoding for SLAM
- URL: http://arxiv.org/abs/2112.12130v1
- Date: Wed, 22 Dec 2021 18:45:44 GMT
- Title: NICE-SLAM: Neural Implicit Scalable Encoding for SLAM
- Authors: Zihan Zhu, Songyou Peng, Viktor Larsson, Weiwei Xu, Hujun Bao,
Zhaopeng Cui, Martin R. Oswald, Marc Pollefeys
- Abstract summary: NICE-SLAM is a dense SLAM system that incorporates multi-level local information by introducing a hierarchical scene representation.
Compared to recent neural implicit SLAM systems, our approach is more scalable, efficient, and robust.
- Score: 112.6093688226293
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural implicit representations have recently shown encouraging results in
various domains, including promising progress in simultaneous localization and
mapping (SLAM). Nevertheless, existing methods produce over-smoothed scene
reconstructions and have difficulty scaling up to large scenes. These
limitations are mainly due to their simple fully-connected network architecture
that does not incorporate local information in the observations. In this paper,
we present NICE-SLAM, a dense SLAM system that incorporates multi-level local
information by introducing a hierarchical scene representation. Optimizing this
representation with pre-trained geometric priors enables detailed
reconstruction on large indoor scenes. Compared to recent neural implicit SLAM
systems, our approach is more scalable, efficient, and robust. Experiments on
five challenging datasets demonstrate competitive results of NICE-SLAM in both
mapping and tracking quality.
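The hierarchical scene representation described in the abstract can be pictured as a stack of feature grids at increasing resolution, queried by trilinear interpolation at 3D sample points and then decoded into occupancy and color. The NumPy sketch below illustrates only this multi-level lookup; it is not the authors' implementation, and the grid resolutions, 32-dimensional features, and random initialization are illustrative assumptions (in NICE-SLAM the grid features are optimized and decoded by pre-trained MLPs).

```python
import numpy as np

def trilinear_lookup(grid, xyz):
    """Interpolate features from a dense (C, X, Y, Z) grid at points xyz in [0, 1]^3."""
    C, X, Y, Z = grid.shape
    res = np.array([X, Y, Z])
    coords = xyz * (res - 1)                 # continuous voxel coordinates
    lo = np.floor(coords).astype(int)        # lower corner of the enclosing cell
    hi = np.minimum(lo + 1, res - 1)         # upper corner, clamped to the grid
    w = coords - lo                          # fractional offsets inside the cell
    out = np.zeros((xyz.shape[0], C))
    # Blend the 8 corner features of each cell with trilinear weights.
    for dx, cx in ((0, lo[:, 0]), (1, hi[:, 0])):
        for dy, cy in ((0, lo[:, 1]), (1, hi[:, 1])):
            for dz, cz in ((0, lo[:, 2]), (1, hi[:, 2])):
                weight = ((w[:, 0] if dx else 1 - w[:, 0])
                          * (w[:, 1] if dy else 1 - w[:, 1])
                          * (w[:, 2] if dz else 1 - w[:, 2]))
                out += weight[:, None] * grid[:, cx, cy, cz].T
    return out

# Multi-level grids: the coarse level captures scene layout, finer levels add
# local detail. Resolutions and feature size are made-up illustrative values.
rng = np.random.default_rng(0)
levels = {name: 0.01 * rng.normal(size=(32, r, r, r))
          for name, r in (("coarse", 8), ("mid", 16), ("fine", 32))}

points = rng.uniform(0.0, 1.0, size=(5, 3))          # sample points in the unit cube
features = {name: trilinear_lookup(g, points) for name, g in levels.items()}
# A full system would feed these per-level features to (pre-trained) MLP decoders
# that output occupancy and color; here we only check the shapes.
print({name: f.shape for name, f in features.items()})   # each (5, 32)
```

The coarse level covers the full scene extent cheaply while the finer levels supply local detail, which is what the abstract means by incorporating multi-level local information.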
Related papers
- DF-SLAM: Dictionary Factors Representation for High-Fidelity Neural Implicit Dense Visual SLAM System [4.498270578781014]
We introduce a high-fidelity neural implicit dense visual Simultaneous Localization and Mapping (SLAM) system, termed DF-SLAM.
In our work, we employ dictionary factors for scene representation, encoding the geometry and appearance information of the scene as a combination of basis and coefficient factors.
Our method exhibits superior scene detail reconstruction capabilities and more efficient memory usage, while our model size is insensitive to the size of the scene map.
arXiv Detail & Related papers (2024-04-27T12:19:23Z)
- DVN-SLAM: Dynamic Visual Neural SLAM Based on Local-Global Encoding [15.035343166377256]
We propose a real-time dynamic visual SLAM system based on a local-global fusion neural implicit representation.
The proposed DVN-SLAM achieves competitive performance in localization and mapping across multiple datasets.
arXiv Detail & Related papers (2024-03-18T13:34:22Z)
- PLGSLAM: Progressive Neural Scene Representation with Local to Global Bundle Adjustment [24.05634277422078]
We introduce PLGSLAM, a neural visual SLAM system capable of high-fidelity surface reconstruction and robust camera tracking in real time.
We show that PLGSLAM achieves state-of-the-art scene reconstruction results and tracking performance across various datasets and scenarios.
arXiv Detail & Related papers (2023-12-15T15:09:30Z)
- DNS SLAM: Dense Neural Semantic-Informed SLAM [92.39687553022605]
DNS SLAM is a novel neural RGB-D semantic SLAM approach featuring a hybrid representation.
Our method integrates multi-view geometry constraints with image-based feature extraction to improve appearance details.
Our experiments demonstrate state-of-the-art tracking performance on both synthetic and real-world data.
arXiv Detail & Related papers (2023-11-30T21:34:44Z)
- CP-SLAM: Collaborative Neural Point-based SLAM System [54.916578456416204]
This paper presents a collaborative implicit neural simultaneous localization and mapping (SLAM) system for RGB-D image sequences.
In order to enable all these modules in a unified framework, we propose a novel neural point-based 3D scene representation.
A distributed-to-centralized learning strategy is proposed for the collaborative implicit SLAM system to improve consistency and cooperation.
arXiv Detail & Related papers (2023-11-14T09:17:15Z)
- NICE-SLAM with Adaptive Feature Grids [1.5962515374223873]
NICE-SLAM is a dense visual SLAM system that combines neural implicit representations with a hierarchical grid-based scene representation.
We present sparse NICE-SLAM, a sparse SLAM system that incorporates the idea of Voxel Hashing into the NICE-SLAM framework (see the voxel-hashing sketch after this list).
arXiv Detail & Related papers (2023-06-04T16:11:45Z)
- Fast Monocular Scene Reconstruction with Global-Sparse Local-Dense Grids [84.90863397388776]
We propose to directly use a signed distance function (SDF) in sparse voxel block grids for fast and accurate scene reconstruction without MLPs.
Our globally sparse and locally dense data structure exploits surfaces' spatial sparsity, enables cache-friendly queries, and allows direct extensions to multi-modal data.
Experiments show that our approach is 10x faster in training and 100x faster in rendering while achieving comparable accuracy to state-of-the-art neural implicit methods.
arXiv Detail & Related papers (2023-05-22T16:50:19Z)
- Point-SLAM: Dense Neural Point Cloud-based SLAM [61.96492935210654]
We propose a dense neural simultaneous localization and mapping (SLAM) approach for monocular RGBD input.
We demonstrate that both tracking and mapping can be performed with the same point-based neural scene representation.
arXiv Detail & Related papers (2023-04-09T16:48:26Z)
- NICER-SLAM: Neural Implicit Scene Encoding for RGB SLAM [111.83168930989503]
NICER-SLAM is a dense RGB SLAM system that simultaneously optimizes for camera poses and a hierarchical neural implicit map representation.
We show strong performance in dense mapping, tracking, and novel view synthesis, even competitive with recent RGB-D SLAM systems.
arXiv Detail & Related papers (2023-02-07T17:06:34Z)
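The "NICE-SLAM with Adaptive Feature Grids" entry above refers to voxel hashing, a way to keep feature grids sparse: small voxel blocks are allocated lazily in a hash map keyed by integer block coordinates, so memory grows with the observed surface rather than with the scene's bounding volume. The sketch below illustrates that general idea only; it is not code from the cited paper, and the block size, voxel size, and feature dimension are made-up values.

```python
import numpy as np

BLOCK = 8        # voxels per block edge (assumed)
VOXEL = 0.05     # voxel size in meters (assumed)
FEAT_DIM = 16    # feature channels per voxel (assumed)

blocks = {}      # (block x, block y, block z) -> (BLOCK, BLOCK, BLOCK, FEAT_DIM) features

def block_key(p):
    """Integer coordinates of the voxel block containing point p (in meters)."""
    return tuple(np.floor(p / (BLOCK * VOXEL)).astype(int))

def allocate(points):
    """Lazily create a feature block for every block touched by the points."""
    for p in points:
        key = block_key(p)
        if key not in blocks:
            blocks[key] = np.zeros((BLOCK, BLOCK, BLOCK, FEAT_DIM), dtype=np.float32)

def lookup(p):
    """Nearest-voxel feature lookup (a real system would interpolate)."""
    key = block_key(p)
    if key not in blocks:
        return None                      # unobserved space: no features stored
    local = np.floor(p / VOXEL).astype(int) % BLOCK
    return blocks[key][local[0], local[1], local[2]]

# Example: observations on a wall at z ~= 0 inside a 4 m cube. Only blocks
# near the wall are allocated, instead of a dense 10 x 10 x 10 block grid.
rng = np.random.default_rng(0)
xy = rng.uniform(-2.0, 2.0, size=(1000, 2))
wall = np.column_stack([xy, np.full(1000, 0.02)])
allocate(wall)
print(f"{len(blocks)} blocks allocated (a dense grid over the cube would need 1000)")
print(lookup(wall[0]).shape)             # (FEAT_DIM,)
```

A hash map like this only changes where features are stored; interpolation (as in the sketch after the abstract above) and decoding would work the same way on the sparse blocks.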
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the accuracy of this information and accepts no responsibility for any consequences of its use.