RobustSplat: Decoupling Densification and Dynamics for Transient-Free 3DGS
- URL: http://arxiv.org/abs/2506.02751v3
- Date: Tue, 29 Jul 2025 13:44:19 GMT
- Title: RobustSplat: Decoupling Densification and Dynamics for Transient-Free 3DGS
- Authors: Chuanyu Fu, Yuqi Zhang, Kunbin Yao, Guanying Chen, Yuan Xiong, Chuan Huang, Shuguang Cui, Xiaochun Cao,
- Abstract summary: 3D Gaussian Splatting has gained significant attention for its real-time, photo-realistic rendering in novel-view synthesis and 3D modeling. Existing methods struggle with accurately modeling scenes affected by transient objects, leading to artifacts in the rendered images. We propose RobustSplat, a robust solution based on two critical designs.
- Score: 79.15416002879239
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: 3D Gaussian Splatting (3DGS) has gained significant attention for its real-time, photo-realistic rendering in novel-view synthesis and 3D modeling. However, existing methods struggle with accurately modeling scenes affected by transient objects, leading to artifacts in the rendered images. We identify that the Gaussian densification process, while enhancing scene detail capture, unintentionally contributes to these artifacts by growing additional Gaussians that model transient disturbances. To address this, we propose RobustSplat, a robust solution based on two critical designs. First, we introduce a delayed Gaussian growth strategy that prioritizes optimizing static scene structure before allowing Gaussian splitting/cloning, mitigating overfitting to transient objects in early optimization. Second, we design a scale-cascaded mask bootstrapping approach that first leverages lower-resolution feature similarity supervision for reliable initial transient mask estimation, taking advantage of its stronger semantic consistency and robustness to noise, and then progresses to high-resolution supervision to achieve more precise mask prediction. Extensive experiments on multiple challenging datasets show that our method outperforms existing methods, clearly demonstrating the robustness and effectiveness of our method. Our project page is https://fcyycf.github.io/RobustSplat/.
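The two designs described in the abstract are essentially training-schedule controls on a standard 3DGS optimization loop. The sketch below is a minimal illustration, not the authors' implementation: the iteration thresholds, scale factors, and helper names are assumptions chosen only to show how densification could be gated and how mask-supervision resolution could be cascaded from coarse to fine.

```python
# Minimal sketch (not RobustSplat's released code) of the two training-loop
# controls from the abstract: (1) delayed Gaussian growth and (2) a
# scale-cascaded schedule for transient-mask supervision.
# All numeric thresholds below are illustrative assumptions.

DELAY_GROWTH_UNTIL = 5_000      # assumed: no split/clone before this iteration
MASK_SCALES = [                 # assumed coarse-to-fine supervision schedule
    (0,      1 / 8),            # start: 1/8-resolution feature-similarity masks
    (10_000, 1 / 4),
    (20_000, 1.0),              # later: full-resolution supervision
]


def allow_densification(iteration: int) -> bool:
    """Delayed Gaussian growth: optimize the static structure first and only
    permit splitting/cloning after a warm-up phase, so early transients do
    not spawn extra Gaussians."""
    return iteration >= DELAY_GROWTH_UNTIL


def mask_resolution(iteration: int) -> float:
    """Scale-cascaded mask bootstrapping: choose the supervision resolution
    for the current iteration, moving from coarse (more semantically
    consistent, noise-robust) to fine (more precise masks)."""
    scale = MASK_SCALES[0][1]
    for start_iter, s in MASK_SCALES:
        if iteration >= start_iter:
            scale = s
    return scale


if __name__ == "__main__":
    # Print the schedule at a few sample iterations.
    for it in (1_000, 6_000, 12_000, 25_000):
        print(f"iter={it:>6}  densify={allow_densification(it)}  "
              f"mask_scale={mask_resolution(it):.3f}")
```

In an actual 3DGS trainer, `allow_densification` would guard the split/clone step and `mask_resolution` would set the resolution at which feature-similarity maps supervise the transient mask for the current iteration.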
Related papers
- Low-Frequency First: Eliminating Floating Artifacts in 3D Gaussian Splatting [22.626200397052862]
3D Gaussian Splatting (3DGS) is a powerful representation for 3D reconstruction. 3DGS often produces floating artifacts, which are erroneous structures detached from the actual geometry. We propose EFA-GS, which selectively expands under-optimized Gaussians to prioritize accurate low-frequency learning.
arXiv Detail & Related papers (2025-08-04T15:03:56Z) - Hi^2-GSLoc: Dual-Hierarchical Gaussian-Specific Visual Relocalization for Remote Sensing [6.997091164331322]
Visual relocalization is fundamental to remote sensing and UAV applications. Existing methods face inherent trade-offs: image-based retrieval and pose regression approaches lack precision. We introduce $\mathrm{Hi}^2$-GSLoc, a dual-hierarchical relocalization framework that follows a sparse-to-dense and coarse-to-fine paradigm.
arXiv Detail & Related papers (2025-07-21T14:47:56Z) - Intern-GS: Vision Model Guided Sparse-View 3D Gaussian Splatting [95.61137026932062]
Intern-GS is a novel approach to enhance the process of sparse-view Gaussian splatting. We show that Intern-GS achieves state-of-the-art rendering quality across diverse datasets.
arXiv Detail & Related papers (2025-05-27T05:17:49Z) - Steepest Descent Density Control for Compact 3D Gaussian Splatting [72.54055499344052]
3D Gaussian Splatting (3DGS) has emerged as a powerful technique for real-time, high-resolution novel view synthesis. We propose a theoretical framework that demystifies and improves density control in 3DGS. We introduce SteepGS, incorporating steepest density control, a principled strategy that minimizes loss while maintaining a compact point cloud.
arXiv Detail & Related papers (2025-05-08T18:41:38Z) - Uncertainty-Aware Normal-Guided Gaussian Splatting for Surface Reconstruction from Sparse Image Sequences [21.120659841877508]
3D Gaussian Splatting (3DGS) has achieved impressive rendering performance in novel view synthesis. We propose Uncertainty-aware Normal-Guided Gaussian Splatting (UNG-GS) to quantify geometric uncertainty within the 3DGS pipeline. UNG-GS significantly outperforms state-of-the-art methods in both sparse and dense sequences.
arXiv Detail & Related papers (2025-03-14T08:18:12Z) - ForestSplats: Deformable transient field for Gaussian Splatting in the Wild [7.260126436331847]
3D-GS shows effectiveness in static scenes, but its performance significantly degrades in real-world environments. We propose ForestSplats, a novel approach that leverages a deformable transient field and a superpixel-aware mask.
arXiv Detail & Related papers (2025-03-08T11:44:57Z) - Mini-Splatting2: Building 360 Scenes within Minutes via Aggressive Gaussian Densification [4.733612131945549]
Mini-Splatting2 achieves a balanced trade-off among optimization time, the number of Gaussians, and rendering quality.
Our work sets the stage for more efficient, high-quality 3D scene modeling in real-world applications.
arXiv Detail & Related papers (2024-11-19T11:47:40Z) - PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices. Our framework capitalizes on the fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-29T15:28:15Z) - Binocular-Guided 3D Gaussian Splatting with View Consistency for Sparse View Synthesis [53.702118455883095]
We propose a novel method for synthesizing novel views from sparse views with Gaussian Splatting.
Our key idea lies in exploring the self-supervisions inherent in the binocular stereo consistency between each pair of binocular images.
Our method significantly outperforms the state-of-the-art methods.
arXiv Detail & Related papers (2024-10-24T15:10:27Z) - Robust 3D Gaussian Splatting for Novel View Synthesis in Presence of Distractors [44.55317154371679]
3D Gaussian Splatting has shown impressive novel view synthesis results.
It is vulnerable to dynamic objects polluting the input data of an otherwise static scene, so called distractors.
We show that our approach is robust to various distractors and strongly improves rendering quality on distractor-polluted scenes.
arXiv Detail & Related papers (2024-08-21T15:21:27Z) - GS-SLAM: Dense Visual SLAM with 3D Gaussian Splatting [51.96353586773191]
We introduce GS-SLAM, the first method to utilize a 3D Gaussian representation in a Simultaneous Localization and Mapping (SLAM) system.
Our method utilizes a real-time differentiable splatting rendering pipeline that offers significant speedup to map optimization and RGB-D rendering.
Our method achieves competitive performance compared with existing state-of-the-art real-time methods on the Replica and TUM-RGBD datasets.
arXiv Detail & Related papers (2023-11-20T12:08:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.