Decomposing Densification in Gaussian Splatting for Faster 3D Scene Reconstruction
- URL: http://arxiv.org/abs/2507.20239v1
- Date: Sun, 27 Jul 2025 11:47:20 GMT
- Title: Decomposing Densification in Gaussian Splatting for Faster 3D Scene Reconstruction
- Authors: Binxiao Huang, Zhengwu Liu, Ngai Wong
- Abstract summary: 3D Gaussian Splatting (GS) has emerged as a powerful representation for high-quality scene reconstruction, offering compelling rendering quality. We present a comprehensive analysis of the split and clone operations during the densification phase, revealing their roles in balancing detail preservation and computational efficiency. We introduce an energy-guided coarse-to-fine multi-resolution training framework, which gradually increases resolution based on energy density in 2D images.
- Score: 5.929129351088044
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: 3D Gaussian Splatting (GS) has emerged as a powerful representation for high-quality scene reconstruction, offering compelling rendering quality. However, the training process of GS often suffers from slow convergence due to inefficient densification and suboptimal spatial distribution of Gaussian primitives. In this work, we present a comprehensive analysis of the split and clone operations during the densification phase, revealing their distinct roles in balancing detail preservation and computational efficiency. Building upon this analysis, we propose a global-to-local densification strategy, which facilitates more efficient growth of Gaussians across the scene space, promoting both global coverage and local refinement. To cooperate with the proposed densification strategy and promote sufficient diffusion of Gaussian primitives in space, we introduce an energy-guided coarse-to-fine multi-resolution training framework, which gradually increases resolution based on energy density in 2D images. Additionally, we dynamically prune unnecessary Gaussian primitives to speed up the training. Extensive experiments on MipNeRF-360, Deep Blending, and Tanks & Temples datasets demonstrate that our approach significantly accelerates training, achieving over 2x speedup with fewer Gaussian primitives and superior reconstruction performance.
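The split and clone operations that the abstract analyzes come from the standard 3DGS densification rule, which this paper reorganizes into a global-to-local schedule. A minimal sketch of that baseline rule (function names and thresholds here are illustrative defaults, not the paper's exact values) might look like:

```python
import numpy as np

def densify_step(positions, scales, view_grads,
                 grad_thresh=2e-4, size_thresh=0.01, split_factor=1.6):
    """Sketch of the standard 3DGS split/clone rule: Gaussians with large
    view-space positional gradients are cloned if small (under-reconstruction)
    or split into two smaller children if large (over-reconstruction)."""
    out_pos, out_scale = [], []
    for p, s, g in zip(positions, scales, view_grads):
        if g <= grad_thresh:
            # gradient small: keep the Gaussian unchanged
            out_pos.append(p); out_scale.append(s)
        elif s.max() <= size_thresh:
            # small Gaussian: clone it in place to add capacity
            out_pos += [p, p]; out_scale += [s, s]
        else:
            # large Gaussian: split into two children sampled inside the parent
            children = p + np.random.randn(2, 3) * s
            out_pos += list(children)
            out_scale += [s / split_factor, s / split_factor]
    return np.array(out_pos), np.array(out_scale)
```

A global-to-local strategy, as described above, would then bias which branch fires over the course of training, favoring coverage-oriented growth early and local refinement late.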
Related papers
- Perceive-Sample-Compress: Towards Real-Time 3D Gaussian Splatting [7.421996491601524]
We introduce a novel perceive-sample-compress framework for 3D Gaussian Splatting. We show that our method significantly improves memory efficiency and visual quality while maintaining real-time rendering speed.
arXiv Detail & Related papers (2025-08-07T01:34:38Z)
- Duplex-GS: Proxy-Guided Weighted Blending for Real-Time Order-Independent Gaussian Splatting [37.17972426764452]
We propose a dual-hierarchy framework that integrates proxy Gaussian representations with order-independent rendering techniques. By seamlessly combining our framework with Order-Independent Transparency (OIT), we develop a physically inspired weighted-sum rendering technique that simultaneously eliminates "popping" and "transparency" artifacts. Our results validate the advantages of the OIT rendering paradigm in Gaussian Splatting, achieving high-quality rendering with an impressive 1.5x to 4x speedup over existing OIT-based Gaussian Splatting approaches.
arXiv Detail & Related papers (2025-08-05T07:44:30Z)
- SD-GS: Structured Deformable 3D Gaussians for Efficient Dynamic Scene Reconstruction [5.818188539758898]
We present SD-GS, a compact and efficient dynamic splatting framework for complex dynamic scene reconstruction. We also present a deformation-aware densification strategy that adaptively grows anchors in under-reconstructed high-dynamic regions. Experimental results demonstrate that SD-GS achieves an average of 60% reduction in model size and an average of 100% improvement in FPS.
arXiv Detail & Related papers (2025-07-10T06:35:03Z)
- Steepest Descent Density Control for Compact 3D Gaussian Splatting [72.54055499344052]
3D Gaussian Splatting (3DGS) has emerged as a powerful representation for real-time, high-resolution novel view synthesis. We propose a theoretical framework that demystifies and improves density control in 3DGS. We introduce SteepGS, which incorporates steepest-descent density control, a principled strategy that minimizes loss while maintaining a compact point cloud.
arXiv Detail & Related papers (2025-05-08T18:41:38Z)
- FreeSplat++: Generalizable 3D Gaussian Splatting for Efficient Indoor Scene Reconstruction [50.534213038479926]
FreeSplat++ is an alternative approach to large-scale indoor whole-scene reconstruction. Our method with depth-regularized per-scene fine-tuning demonstrates substantial improvements in reconstruction accuracy and a notable reduction in training time.
arXiv Detail & Related papers (2025-03-29T06:22:08Z)
- ProtoGS: Efficient and High-Quality Rendering with 3D Gaussian Prototypes [81.48624894781257]
3D Gaussian Splatting (3DGS) has made significant strides in novel view synthesis but is limited by the substantial number of Gaussian primitives required. Recent methods address this issue by compressing the storage size of densified Gaussians, yet fail to preserve rendering quality and efficiency. We propose ProtoGS to learn Gaussian prototypes to represent Gaussian primitives, significantly reducing the total Gaussian amount without sacrificing visual quality.
arXiv Detail & Related papers (2025-03-21T18:55:14Z)
- Mini-Splatting2: Building 360 Scenes within Minutes via Aggressive Gaussian Densification [4.733612131945549]
Mini-Splatting2 achieves a balanced trade-off among optimization time, the number of Gaussians, and rendering quality.
Our work sets the stage for more efficient, high-quality 3D scene modeling in real-world applications.
arXiv Detail & Related papers (2024-11-19T11:47:40Z)
- DGTR: Distributed Gaussian Turbo-Reconstruction for Sparse-View Vast Scenes [81.56206845824572]
Novel-view synthesis (NVS) approaches play a critical role in vast scene reconstruction.
Few-shot methods often struggle with poor reconstruction quality in vast environments.
This paper presents DGTR, a novel distributed framework for efficient Gaussian reconstruction for sparse-view vast scenes.
arXiv Detail & Related papers (2024-11-19T07:51:44Z)
- Efficient Density Control for 3D Gaussian Splatting [3.6379656024631215]
3D Gaussian Splatting (3DGS) has demonstrated outstanding performance in novel view synthesis. We propose two key innovations: (1) Long-Axis Split, which precisely controls the position, shape, and opacity of child Gaussians; and (2) Recovery-Aware Pruning, which leverages differences in recovery speed after resetting opacity to prune overfitted Gaussians.
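The Long-Axis Split named above is only summarized here; one plausible geometric reading, splitting along the parent's longest principal axis with deterministic child placement, can be sketched as follows (the offset and shrink factors are illustrative assumptions, not the paper's exact formulas):

```python
import numpy as np

def long_axis_split(mean, cov, opacity):
    """Hypothetical long-axis split: place two child Gaussians along the
    parent's longest principal axis and shrink that axis, so the children
    together cover the parent's extent."""
    vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    axis = vecs[:, -1]                 # longest principal axis
    sigma = np.sqrt(vals[-1])          # parent std-dev along that axis
    offset = 0.5 * sigma * axis        # push the children apart along it
    # reduce the longest eigenvalue to 25% (halve sigma along that axis)
    child_cov = cov - 0.75 * vals[-1] * np.outer(axis, axis)
    return [(mean + offset, child_cov, opacity),
            (mean - offset, child_cov, opacity)]
```

Placing children deterministically along the dominant axis, rather than sampling randomly inside the parent as the original 3DGS split does, is one way to "precisely control" child position and shape as the summary describes.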
arXiv Detail & Related papers (2024-11-15T12:12:56Z)
- DyGASR: Dynamic Generalized Exponential Splatting with Surface Alignment for Accelerated 3D Mesh Reconstruction [1.2891210250935148]
We propose DyGASR, which utilizes a generalized exponential function instead of the traditional 3D Gaussian to decrease the number of particles.
We also introduce Generalized Surface Regularization (GSR), which reduces the smallest scaling vector of each point cloud to zero.
Our approach surpasses existing 3DGS-based mesh reconstruction methods, demonstrating a 25% increase in speed, and a 30% reduction in memory usage.
arXiv Detail & Related papers (2024-11-14T03:19:57Z)
- CityGaussianV2: Efficient and Geometrically Accurate Reconstruction for Large-Scale Scenes [53.107474952492396]
CityGaussianV2 is a novel approach for large-scale scene reconstruction. We implement a decomposed-gradient-based densification and depth regression technique to eliminate blurry artifacts and accelerate convergence. Our method strikes a promising balance among visual quality, geometric accuracy, and storage and training costs.
arXiv Detail & Related papers (2024-11-01T17:59:31Z)
- GES: Generalized Exponential Splatting for Efficient Radiance Field Rendering [112.16239342037714]
GES (Generalized Exponential Splatting) is a novel representation that employs Generalized Exponential Function (GEF) to model 3D scenes.
With the aid of a frequency-modulated loss, GES achieves competitive performance in novel-view synthesis benchmarks.
arXiv Detail & Related papers (2024-02-15T17:32:50Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality or accuracy of this information and is not responsible for any consequences arising from its use.