GP-GS: Gaussian Processes for Enhanced Gaussian Splatting
- URL: http://arxiv.org/abs/2502.02283v5
- Date: Tue, 13 May 2025 15:14:37 GMT
- Title: GP-GS: Gaussian Processes for Enhanced Gaussian Splatting
- Authors: Zhihao Guo, Jingxuan Su, Shenglin Wang, Jinlong Fan, Jing Zhang, Wei Zhou, Hadi Amirpour, Yunlong Zhao, Liangxiu Han, Peng Wang
- Abstract summary: This paper proposes a novel 3D reconstruction framework, Gaussian Processes enhanced Gaussian Splatting (GP-GS). GP-GS enables adaptive and uncertainty-guided densification of sparse Structure-from-Motion point clouds. Experiments conducted on synthetic and real-world datasets validate the effectiveness and practicality of the proposed framework.
- Score: 15.263608848427136
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: 3D Gaussian Splatting has emerged as an efficient photorealistic novel view synthesis method. However, its reliance on sparse Structure-from-Motion (SfM) point clouds often limits scene reconstruction quality. To address this limitation, this paper proposes a novel 3D reconstruction framework, Gaussian Processes enhanced Gaussian Splatting (GP-GS), in which a multi-output Gaussian Process model is developed to enable adaptive and uncertainty-guided densification of sparse SfM point clouds. Specifically, we propose a dynamic sampling and filtering pipeline that adaptively expands the SfM point clouds by leveraging GP-based predictions to infer new candidate points from the input 2D pixels and depth maps. The pipeline utilizes uncertainty estimates to guide the pruning of high-variance predictions, ensuring geometric consistency and enabling the generation of dense point clouds. These densified point clouds provide high-quality initial 3D Gaussians, enhancing reconstruction performance. Extensive experiments conducted on synthetic and real-world datasets across various scales validate the effectiveness and practicality of the proposed framework.
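The densification idea described in the abstract can be sketched in a few lines. This is a minimal toy illustration, not the authors' implementation: it uses scikit-learn's single-output `GaussianProcessRegressor` as a stand-in for the paper's multi-output GP, and the synthetic data, RBF kernel, and quantile threshold `tau` are all invented for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Toy sparse "SfM" observations: 2D pixel coordinates -> observed depth.
pixels_sparse = rng.uniform(0, 1, size=(50, 2))
depth_sparse = np.sin(pixels_sparse[:, 0] * 3) + 0.1 * rng.normal(size=50)

# Fit a GP mapping pixel coordinates to depth (kernel choice is illustrative).
gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(0.01), normalize_y=True)
gp.fit(pixels_sparse, depth_sparse)

# Densify: sample candidate pixels and predict depth with uncertainty.
candidates = rng.uniform(0, 1, size=(500, 2))
depth_pred, depth_std = gp.predict(candidates, return_std=True)

# Uncertainty-guided pruning: keep only low-variance predictions.
tau = np.quantile(depth_std, 0.5)  # hypothetical threshold
keep = depth_std <= tau
dense_points = np.column_stack([candidates[keep], depth_pred[keep]])
print(dense_points.shape)
```

The kept `(pixel_x, pixel_y, depth)` triples would then seed the initial 3D Gaussians; the actual paper additionally predicts color and works from real depth maps rather than a toy regression target.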
Related papers
- UGOD: Uncertainty-Guided Differentiable Opacity and Soft Dropout for Enhanced Sparse-View 3DGS [8.78995910690481]
3D Gaussian Splatting (3DGS) has become a competitive approach for novel view synthesis (NVS). We investigate how adaptive weighting of Gaussians, characterised by learned uncertainties, affects rendering quality. Our method achieves a 3.27% PSNR improvement on the MipNeRF 360 dataset.
arXiv Detail & Related papers (2025-08-07T01:42:22Z)
- GDGS: 3D Gaussian Splatting Via Geometry-Guided Initialization And Dynamic Density Control [6.91367883100748]
Gaussian Splatting is an alternative for rendering realistic images while supporting real-time performance. We propose a method to enhance 3D Gaussian Splatting (3DGS) (Kerbl et al., 2023), addressing challenges in initialization, optimization, and density control. Our method demonstrates comparable or superior results to state-of-the-art methods, rendering high-fidelity images in real time.
arXiv Detail & Related papers (2025-07-01T01:29:31Z)
- Metropolis-Hastings Sampling for 3D Gaussian Reconstruction [24.110069582862465]
We propose an adaptive sampling framework for 3D Gaussian Splatting (3DGS). Our framework overcomes these limitations by reformulating densification and pruning as a probabilistic sampling process. Our approach enhances computational efficiency while matching or modestly surpassing the view-synthesis quality of state-of-the-art models.
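As a rough illustration of this reformulation, the sketch below runs a toy 1D Metropolis-Hastings loop that places samples in proportion to a hypothetical reconstruction-error density, so high-error regions receive more candidates. The density function, proposal scale, and chain length are all invented for the example and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy unnormalized "reconstruction error" density over 1D positions;
# regions with higher error should attract more Gaussians.
def error_density(x):
    return np.exp(-(x - 2.0) ** 2) + 0.5 * np.exp(-(x + 1.0) ** 2 / 0.5)

# Metropolis-Hastings: propose a perturbed position, accept with the density ratio.
x = 0.0
samples = []
for _ in range(5000):
    proposal = x + rng.normal(scale=0.8)
    if rng.uniform() < error_density(proposal) / error_density(x):
        x = proposal
    samples.append(x)
samples = np.array(samples[1000:])  # discard burn-in
print(samples.mean())
```

In the paper's setting the target density would be driven by per-Gaussian rendering error rather than a fixed analytic function, and the state space would be the 3D Gaussian parameters.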
arXiv Detail & Related papers (2025-06-15T19:12:37Z)
- Steepest Descent Density Control for Compact 3D Gaussian Splatting [72.54055499344052]
3D Gaussian Splatting (3DGS) has emerged as a powerful real-time, high-resolution novel view synthesis method. We propose a theoretical framework that demystifies and improves density control in 3DGS. We introduce SteepGS, incorporating steepest descent density control, a principled strategy that minimizes loss while maintaining a compact point cloud.
arXiv Detail & Related papers (2025-05-08T18:41:38Z)
- ProtoGS: Efficient and High-Quality Rendering with 3D Gaussian Prototypes [81.48624894781257]
3D Gaussian Splatting (3DGS) has made significant strides in novel view synthesis but is limited by the substantial number of Gaussian primitives required.
Recent methods address this issue by compressing the storage size of densified Gaussians, yet fail to preserve rendering quality and efficiency.
We propose ProtoGS to learn Gaussian prototypes to represent Gaussian primitives, significantly reducing the total Gaussian amount without sacrificing visual quality.
arXiv Detail & Related papers (2025-03-21T18:55:14Z)
- Uncertainty-Aware Normal-Guided Gaussian Splatting for Surface Reconstruction from Sparse Image Sequences [21.120659841877508]
3D Gaussian Splatting (3DGS) has achieved impressive rendering performance in novel view synthesis.
We propose Uncertainty-aware Normal-Guided Gaussian Splatting (UNG-GS) to quantify geometric uncertainty within the 3DGS pipeline.
UNG-GS significantly outperforms state-of-the-art methods in both sparse and dense sequences.
arXiv Detail & Related papers (2025-03-14T08:18:12Z)
- RP-SLAM: Real-time Photorealistic SLAM with Efficient 3D Gaussian Splatting [22.76955981251234]
RP-SLAM is a 3D Gaussian splatting-based visual SLAM method for monocular and RGB-D cameras. It achieves state-of-the-art map rendering accuracy while ensuring real-time performance and model compactness.
arXiv Detail & Related papers (2024-12-13T05:27:35Z)
- PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
PF3plat sets a new state-of-the-art across all benchmarks, supported by comprehensive ablation studies validating our design choices.
Our framework capitalizes on fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-29T15:28:15Z)
- Implicit Gaussian Splatting with Efficient Multi-Level Tri-Plane Representation [45.582869951581785]
Implicit Gaussian Splatting (IGS) is an innovative hybrid model that integrates explicit point clouds with implicit feature embeddings.
We introduce a level-based progressive training scheme, which incorporates explicit spatial regularization.
Our algorithm can deliver high-quality rendering using only a few MBs, effectively balancing storage efficiency and rendering fidelity.
arXiv Detail & Related papers (2024-08-19T14:34:17Z)
- GaussianRoom: Improving 3D Gaussian Splatting with SDF Guidance and Monocular Cues for Indoor Scene Reconstruction [3.043712258792239]
We present a unified framework integrating neural SDF with 3DGS.
This framework incorporates a learnable neural SDF field to guide the densification and pruning of Gaussians.
Our method achieves state-of-the-art performance in both surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-05-30T03:46:59Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
- Mesh-based Gaussian Splatting for Real-time Large-scale Deformation [58.18290393082119]
It is challenging for users to directly deform or manipulate implicit representations with large deformations in real time.
We develop a novel GS-based method that enables interactive deformation.
Our approach achieves high-quality reconstruction and effective deformation, while maintaining the promising rendering results at a high frame rate.
arXiv Detail & Related papers (2024-02-07T12:36:54Z)
- GPS-Gaussian: Generalizable Pixel-wise 3D Gaussian Splatting for Real-time Human Novel View Synthesis [70.24111297192057]
We present a new approach, termed GPS-Gaussian, for synthesizing novel views of a character in real time.
The proposed method enables 2K-resolution rendering under a sparse-view camera setting.
arXiv Detail & Related papers (2023-12-04T18:59:55Z)
- FSGS: Real-Time Few-shot View Synthesis using Gaussian Splatting [58.41056963451056]
We propose a few-shot view synthesis framework based on 3D Gaussian Splatting.
This framework enables real-time and photo-realistic view synthesis with as few as three training views.
FSGS achieves state-of-the-art performance in both accuracy and rendering efficiency across diverse datasets.
arXiv Detail & Related papers (2023-12-01T09:30:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.