PlanarGS: High-Fidelity Indoor 3D Gaussian Splatting Guided by Vision-Language Planar Priors
- URL: http://arxiv.org/abs/2510.23930v1
- Date: Mon, 27 Oct 2025 23:32:19 GMT
- Title: PlanarGS: High-Fidelity Indoor 3D Gaussian Splatting Guided by Vision-Language Planar Priors
- Authors: Xirui Jin, Renbiao Jin, Boying Li, Danping Zou, Wenxian Yu
- Abstract summary: PlanarGS is a 3DGS-based framework tailored for indoor scene reconstruction. PlanarGS reconstructs accurate and detailed 3D surfaces, consistently outperforming state-of-the-art methods by a large margin.
- Score: 13.825701925456768
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Three-dimensional Gaussian Splatting (3DGS) has recently emerged as an efficient representation for novel-view synthesis, achieving impressive visual quality. However, in scenes dominated by large and low-texture regions, common in indoor environments, the photometric loss used to optimize 3DGS yields ambiguous geometry and fails to recover high-fidelity 3D surfaces. To overcome this limitation, we introduce PlanarGS, a 3DGS-based framework tailored for indoor scene reconstruction. Specifically, we design a pipeline for Language-Prompted Planar Priors (LP3) that employs a pretrained vision-language segmentation model and refines its region proposals via cross-view fusion and inspection with geometric priors. 3D Gaussians in our framework are optimized with two additional terms: a planar prior supervision term that enforces planar consistency, and a geometric prior supervision term that steers the Gaussians toward the depth and normal cues. We have conducted extensive experiments on standard indoor benchmarks. The results show that PlanarGS reconstructs accurate and detailed 3D surfaces, consistently outperforming state-of-the-art methods by a large margin. Project page: https://planargs.github.io
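The abstract's two extra optimization terms can be sketched as simple per-region losses. The snippet below is a minimal illustration, not the authors' implementation: it assumes a segmented planar region is available as a boolean mask over back-projected 3D points (the planar prior term) and that monocular normal priors are given per pixel (the geometric prior term). All function names are illustrative.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through points (N, 3).

    Returns a unit normal n and offset d such that n @ x + d = 0
    for points x on the plane.
    """
    centroid = points.mean(axis=0)
    # The smallest right singular vector of the centered points
    # spans the direction of least variance, i.e. the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]
    d = -n @ centroid
    return n, d

def planar_prior_loss(points, mask):
    """Mean point-to-plane distance for one segmented planar region.

    points: (N, 3) back-projected 3D points; mask: (N,) bool region mask.
    Penalizing this distance pushes the region toward planar consistency.
    """
    region = points[mask]
    n, d = fit_plane(region)
    return np.abs(region @ n + d).mean()

def geometric_prior_loss(rendered_normals, prior_normals):
    """1 - cosine similarity between rendered and prior normals.

    Both inputs are (M, 3) arrays of unit normals; the loss steers
    rendered geometry toward the monocular normal cues.
    """
    dot = np.sum(rendered_normals * prior_normals, axis=-1)
    return (1.0 - dot).mean()
```

In the actual method these terms supervise the 3D Gaussians during optimization, alongside the standard photometric loss; here they are stated on point/pixel arrays only to make the geometry explicit.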
Related papers
- Accurate and Complete Surface Reconstruction from 3D Gaussians via Direct SDF Learning [5.604709769018076]
3D Gaussian Splatting (3DGS) has emerged as a powerful paradigm for photorealistic view synthesis. We propose DiGS, a unified framework that embeds Signed Distance Field (SDF) learning directly into the 3DGS pipeline. We show that DiGS consistently improves reconstruction accuracy and completeness while retaining high fidelity.
arXiv Detail & Related papers (2025-09-09T08:17:46Z)
- EG-Gaussian: Epipolar Geometry and Graph Network Enhanced 3D Gaussian Splatting [9.94641948288285]
EG-Gaussian utilizes epipolar geometry and graph networks for 3D scene reconstruction. Our approach significantly improves reconstruction accuracy compared to 3DGS-based methods.
arXiv Detail & Related papers (2025-04-18T08:10:39Z)
- Planar Gaussian Splatting [42.74999794635269]
Planar Gaussian Splatting (PGS) is a novel neural rendering approach to learn the 3D geometry and parse the 3D planes of a scene. PGS achieves state-of-the-art performance in 3D planar reconstruction without requiring either 3D plane labels or depth supervision.
arXiv Detail & Related papers (2024-12-02T19:46:43Z)
- AGS-Mesh: Adaptive Gaussian Splatting and Meshing with Geometric Priors for Indoor Room Reconstruction Using Smartphones [19.429461194706786]
We propose an approach for joint surface depth and normal refinement of Gaussian Splatting methods for accurate 3D reconstruction of indoor scenes. Our filtering strategy and optimization design demonstrate significant improvements in both mesh estimation and novel-view synthesis.
arXiv Detail & Related papers (2024-11-28T17:04:32Z)
- MonoGSDF: Exploring Monocular Geometric Cues for Gaussian Splatting-Guided Implicit Surface Reconstruction [86.87464903285208]
We introduce MonoGSDF, a novel method that couples primitives with a neural Signed Distance Field (SDF) for high-quality reconstruction. To handle arbitrary-scale scenes, we propose a scaling strategy for robust generalization. Experiments on real-world datasets show it outperforms prior methods while maintaining efficiency.
arXiv Detail & Related papers (2024-11-25T20:07:07Z)
- GigaGS: Scaling up Planar-Based 3D Gaussians for Large Scene Surface Reconstruction [71.08607897266045]
3D Gaussian Splatting (3DGS) has shown promising performance in novel view synthesis.
We make the first attempt to tackle the challenging task of large-scale scene surface reconstruction.
We propose GigaGS, the first work for high-quality surface reconstruction for large-scale scenes using 3DGS.
arXiv Detail & Related papers (2024-09-10T17:51:39Z)
- Trim 3D Gaussian Splatting for Accurate Geometry Representation [72.00970038074493]
We introduce Trim 3D Gaussian Splatting (TrimGS) to reconstruct accurate 3D geometry from images.
Our experimental and theoretical analyses reveal that a relatively small Gaussian scale is a non-negligible factor in representing and optimizing the intricate details.
When combined with the original 3DGS and the state-of-the-art 2DGS, TrimGS consistently yields more accurate geometry and higher perceptual quality.
arXiv Detail & Related papers (2024-06-11T17:34:46Z)
- GaussianRoom: Improving 3D Gaussian Splatting with SDF Guidance and Monocular Cues for Indoor Scene Reconstruction [5.112375652774415]
We propose a unified optimization framework that integrates neural signed distance fields (SDFs) with 3DGS for accurate geometry reconstruction and real-time rendering. Our method achieves state-of-the-art performance in both surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-05-30T03:46:59Z)
- SAGS: Structure-Aware 3D Gaussian Splatting [53.6730827668389]
We propose a structure-aware Gaussian Splatting method (SAGS) that implicitly encodes the geometry of the scene.
SAGS achieves state-of-the-art rendering performance and reduced storage requirements on benchmark novel-view synthesis datasets.
arXiv Detail & Related papers (2024-04-29T23:26:30Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
- GaussianPro: 3D Gaussian Splatting with Progressive Propagation [49.918797726059545]
3DGS relies heavily on the point cloud produced by Structure-from-Motion (SfM) techniques.
We propose a novel method that applies a progressive propagation strategy to guide the densification of the 3D Gaussians.
Our method significantly surpasses 3DGS on the evaluated dataset, exhibiting an improvement of 1.15 dB in terms of PSNR.
arXiv Detail & Related papers (2024-02-22T16:00:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.