VIGS-SLAM: Visual Inertial Gaussian Splatting SLAM
- URL: http://arxiv.org/abs/2512.02293v1
- Date: Tue, 02 Dec 2025 00:19:13 GMT
- Title: VIGS-SLAM: Visual Inertial Gaussian Splatting SLAM
- Authors: Zihan Zhu, Wei Zhang, Norbert Haala, Marc Pollefeys, Daniel Barath
- Abstract summary: We present VIGS-SLAM, a visual-inertial 3D Gaussian Splatting SLAM system. It achieves robust real-time tracking and high-fidelity reconstruction. Our method tightly couples visual and inertial cues within a unified optimization framework.
- Score: 75.55522219717137
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present VIGS-SLAM, a visual-inertial 3D Gaussian Splatting SLAM system that achieves robust real-time tracking and high-fidelity reconstruction. Although recent 3DGS-based SLAM methods achieve dense and photorealistic mapping, their purely visual design degrades under motion blur, low texture, and exposure variations. Our method tightly couples visual and inertial cues within a unified optimization framework, jointly refining camera poses, depths, and IMU states. It features robust IMU initialization, time-varying bias modeling, and loop closure with consistent Gaussian updates. Experiments on four challenging datasets demonstrate our superiority over state-of-the-art methods. Project page: https://vigs-slam.github.io
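The tight visual-inertial coupling described in the abstract can be illustrated, at a toy level, as a stacked weighted least-squares problem in which visual and inertial residuals constrain the same state variables. This is not the paper's actual formulation (which jointly refines SE(3) camera poses, depths, and IMU bias states against 3D Gaussian photometric residuals); the 1D state, function name, and weights below are illustrative assumptions only:

```python
import numpy as np

def fuse_visual_inertial(z, a, dt=1.0, w_vis=1.0, w_imu=10.0):
    """Toy 1D visual-inertial fusion: jointly estimate positions x[0..n-1]
    from per-frame visual position measurements z and IMU accelerations a
    (one per interior frame), via stacked weighted least squares."""
    n = len(z)
    rows, rhs = [], []
    # Visual residuals: estimated position x[t] should match measurement z[t].
    for t in range(n):
        r = np.zeros(n)
        r[t] = 1.0
        rows.append(w_vis * r)
        rhs.append(w_vis * z[t])
    # Inertial residuals: the discrete second difference of position
    # should match the measured acceleration a[t-1] * dt^2.
    for t in range(1, n - 1):
        r = np.zeros(n)
        r[t - 1], r[t], r[t + 1] = 1.0, -2.0, 1.0
        rows.append(w_imu * r)
        rhs.append(w_imu * a[t - 1] * dt ** 2)
    A = np.vstack(rows)
    x, *_ = np.linalg.lstsq(A, np.array(rhs), rcond=None)
    return x
```

Because both residual types share the state vector, noisy visual measurements are smoothed toward trajectories consistent with the IMU signal; the relative weights play the role of the measurement covariances in a real tightly coupled estimator.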
Related papers
- FeatureSLAM: Feature-enriched 3D gaussian splatting SLAM in real time [11.883404434697809]
We present a real-time tracking SLAM system that unifies efficient camera tracking and feature-enriched mapping using 3D Gaussian Splatting (3DGS). Our main contribution is integrating dense feature semanticization into novel-view synthesis, aligned with a visual foundation model. We obtain 9% lower pose error and 8% higher mapping accuracy compared to recent fixed-set SLAM baselines.
arXiv Detail & Related papers (2026-01-09T11:40:16Z)
- Pseudo Depth Meets Gaussian: A Feed-forward RGB SLAM Baseline [64.42938561167402]
We propose an online 3D reconstruction method using 3D Gaussian-based SLAM, combined with a feed-forward recurrent prediction module. This approach replaces slow test-time optimization with fast network inference, significantly improving tracking speed. Our method achieves performance on par with the state-of-the-art SplaTAM, while reducing tracking time by more than 90%.
arXiv Detail & Related papers (2025-08-06T16:16:58Z)
- LEG-SLAM: Real-Time Language-Enhanced Gaussian Splatting for SLAM [0.0]
LEG-SLAM is a novel approach that fuses an optimized Gaussian Splatting implementation with visual-language feature extraction. Our method simultaneously generates high-quality photorealistic images and semantically labeled scene maps. With its potential applications in autonomous robotics, augmented reality, and other interactive domains, LEG-SLAM represents a significant step forward in real-time semantic 3D Gaussian-based SLAM.
arXiv Detail & Related papers (2025-06-03T16:51:59Z) - WildGS-SLAM: Monocular Gaussian Splatting SLAM in Dynamic Environments [48.51530726697405]
We present WildGS-SLAM, a robust and efficient monocular RGB SLAM system designed to handle dynamic environments. We introduce an uncertainty map, predicted by a shallow multi-layer perceptron and DINOv2 features, to guide dynamic object removal during both tracking and mapping. Results showcase WildGS-SLAM's superior performance in dynamic environments compared to state-of-the-art methods.
arXiv Detail & Related papers (2025-04-04T19:19:40Z) - EVolSplat: Efficient Volume-based Gaussian Splatting for Urban View Synthesis [61.1662426227688]
Existing NeRF and 3DGS-based methods show promising results in achieving photorealistic renderings but require slow, per-scene optimization. We introduce EVolSplat, an efficient 3D Gaussian Splatting model for urban scenes that works in a feed-forward manner.
arXiv Detail & Related papers (2025-03-26T02:47:27Z) - GI-SLAM: Gaussian-Inertial SLAM [2.186901738997927]
3D Gaussian Splatting (3DGS) has emerged as a powerful representation of geometry and appearance for dense Simultaneous Localization and Mapping (SLAM). We present GI-SLAM, a novel Gaussian-inertial SLAM system that consists of an IMU-enhanced camera tracking module and a realistic 3D Gaussian-based representation for mapping.
arXiv Detail & Related papers (2025-03-24T01:45:40Z) - FlashSLAM: Accelerated RGB-D SLAM for Real-Time 3D Scene Reconstruction with Gaussian Splatting [14.130327598928778]
FlashSLAM is a novel SLAM approach that leverages 3D Gaussian Splatting for efficient and robust 3D scene reconstruction. Existing 3DGS-based SLAM methods often fall short in sparse-view settings and during large camera movements. Our method achieves up to a 92% improvement in average tracking accuracy over previous methods.
arXiv Detail & Related papers (2024-12-01T05:44:38Z) - MM3DGS SLAM: Multi-modal 3D Gaussian Splatting for SLAM Using Vision, Depth, and Inertial Measurements [59.70107451308687]
We show for the first time that using 3D Gaussians for map representation with unposed camera images and inertial measurements can enable accurate SLAM.
Our method, MM3DGS, addresses the limitations of prior rendering-based methods by enabling faster scale awareness and improved trajectory tracking.
We also release a multi-modal dataset, UT-MM, collected from a mobile robot equipped with a camera and an inertial measurement unit.
arXiv Detail & Related papers (2024-04-01T04:57:41Z) - Gaussian Splatting SLAM [16.3858380078553]
We present the first application of 3D Gaussian Splatting in monocular SLAM.
Our method runs live at 3fps, unifying the required representation for accurate tracking, mapping, and high-quality rendering.
Several innovations are required to continuously reconstruct 3D scenes with high fidelity from a live camera.
arXiv Detail & Related papers (2023-12-11T18:19:04Z) - GS-SLAM: Dense Visual SLAM with 3D Gaussian Splatting [51.96353586773191]
We introduce GS-SLAM, the first method to utilize a 3D Gaussian representation in a Simultaneous Localization and Mapping system.
Our method utilizes a real-time differentiable splatting rendering pipeline that offers significant speedup to map optimization and RGB-D rendering.
Our method achieves competitive performance compared with existing state-of-the-art real-time methods on the Replica and TUM-RGBD datasets.
arXiv Detail & Related papers (2023-11-20T12:08:23Z) - Dense RGB-D-Inertial SLAM with Map Deformations [25.03159756734727]
We propose the first tightly-coupled dense RGB-D-inertial SLAM system.
We show that our system is more robust to fast motions and periods of low texture and low geometric variation than a related RGB-D-only SLAM system.
arXiv Detail & Related papers (2022-07-22T08:33:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.