Multi-view Normal and Distance Guidance Gaussian Splatting for Surface Reconstruction
- URL: http://arxiv.org/abs/2508.07701v2
- Date: Wed, 13 Aug 2025 15:51:51 GMT
- Title: Multi-view Normal and Distance Guidance Gaussian Splatting for Surface Reconstruction
- Authors: Bo Jia, Yanan Guo, Ying Chang, Benkui Zhang, Ying Xie, Kangning Du, Lin Cao
- Abstract summary: 3D Gaussian Splatting (3DGS) achieves remarkable results in the field of surface reconstruction. However, when Gaussian normal vectors are aligned within the single-view projection plane, the geometry appears reasonable in the current view, but biases may emerge upon switching to nearby views. We develop a multi-view normal enhancement module, which ensures cross-view consistency by matching the normals of pixel points in nearby views and computing a loss between them.
- Score: 2.760653393100493
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: 3D Gaussian Splatting (3DGS) achieves remarkable results in the field of surface reconstruction. However, when Gaussian normal vectors are aligned only within the single-view projection plane, the geometry appears reasonable in the current view, but biases may emerge upon switching to nearby views. To address the distance and global matching challenges in multi-view scenes, we design multi-view normal and distance-guided Gaussian splatting. This method achieves geometric depth unification and high-accuracy reconstruction by constraining nearby depth maps and aligning 3D normals. Specifically, for the reconstruction of small indoor and outdoor scenes, we propose a multi-view distance reprojection regularization module that achieves multi-view Gaussian alignment by computing the distance loss between two nearby views and the same Gaussian surface. Additionally, we develop a multi-view normal enhancement module, which ensures consistency across views by matching the normals of pixel points in nearby views and computing a loss between them. Extensive experimental results demonstrate that our method outperforms the baseline in both quantitative and qualitative evaluations, significantly enhancing the surface reconstruction capability of 3DGS. Our code will be made publicly available at (https://github.com/Bistu3DV/MND-GS/).
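As a rough illustration of the multi-view normal enhancement idea described above (not the paper's actual implementation, whose code is not yet released), the sketch below rotates per-pixel unit normals from one view into a nearby view's camera frame and penalizes their angular disagreement. The pixel matching step (e.g. via depth reprojection) is assumed to have been done already, and all function and parameter names are hypothetical.

```python
def rotate(R, n):
    """Apply a 3x3 rotation matrix (list of rows) to a 3-vector."""
    return [sum(R[i][j] * n[j] for j in range(3)) for i in range(3)]

def normal_consistency_loss(normals_a, normals_b, R_ab):
    """Hypothetical sketch of a multi-view normal consistency loss:
    rotate view A's unit normals into view B's camera frame via the
    relative rotation R_ab, then penalize 1 - cos(angle) against
    view B's matched normals. Assumes pixel correspondences were
    already established (e.g. by reprojecting view A's depth map
    into view B)."""
    total = 0.0
    for na, nb in zip(normals_a, normals_b):
        ra = rotate(R_ab, na)
        # Dot product of two unit vectors gives cos of their angle.
        cos = sum(x * y for x, y in zip(ra, nb))
        total += 1.0 - cos
    return total / len(normals_a)
```

With an identity relative rotation and identical normal maps the loss is zero; fully orthogonal normals contribute 1.0 each, so the loss grows with cross-view normal disagreement, which is the quantity the paper's module is described as minimizing.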
Related papers
- GVGS: Gaussian Visibility-Aware Multi-View Geometry for Accurate Surface Reconstruction [15.170414649311441]
3D Gaussian Splatting enables efficient optimization and high-quality rendering, yet accurate surface reconstruction remains challenging. We introduce a visibility-aware multi-view geometric consistency constraint that aggregates the visibility of shared Gaussian primitives across views. We also propose a progressive quadtree-calibrated monocular depth constraint that performs block-wise affine calibration from coarse to fine spatial scales.
arXiv Detail & Related papers (2026-01-28T07:48:51Z)
- SPAGS: Sparse-View Articulated Object Reconstruction from Single State via Planar Gaussian Splatting [8.690795471370643]
We propose a category-agnostic articulated object reconstruction framework via planar Gaussian Splatting. Our method achieves higher-fidelity part-level surface reconstruction on both synthetic and real-world data.
arXiv Detail & Related papers (2025-11-21T09:49:53Z)
- UniGS: Unified Geometry-Aware Gaussian Splatting for Multimodal Rendering [10.560500427919647]
We propose UniGS, a unified map representation and differentiable attribute reconstruction based on 3D Gaussian Splatting. Our framework integrates a multimodal viewer capable of simultaneously rendering photo-realistic RGB images, geometrically accurate depth maps, consistent surface normals, and semantic logits.
arXiv Detail & Related papers (2025-10-14T06:07:57Z)
- VA-GS: Enhancing the Geometric Representation of Gaussian Splatting via View Alignment [48.147381011235446]
3D Gaussian Splatting has recently emerged as an efficient solution for real-time novel view synthesis. We propose a novel method that enhances the geometric representation of 3D Gaussians through view alignment. Our method achieves state-of-the-art performance in both surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2025-10-13T14:44:50Z)
- MeshSplat: Generalizable Sparse-View Surface Reconstruction via Gaussian Splatting [37.35249331090283]
We propose MeshSplat, a generalizable sparse-view surface reconstruction framework via Gaussian Splatting. Our key idea is to leverage 2DGS as a bridge that connects novel view synthesis to learned geometric priors. We incorporate a feed-forward network to predict per-view pixel-aligned 2DGS, which enables the network to synthesize novel view images.
arXiv Detail & Related papers (2025-08-25T09:04:20Z)
- Surf3R: Rapid Surface Reconstruction from Sparse RGB Views in Seconds [34.38496869014632]
Surf3R is an end-to-end feed-forward approach that reconstructs 3D surfaces from sparse views without estimating camera poses. Our method employs a multi-branch, multi-view decoding architecture in which multiple reference views jointly guide the reconstruction process.
arXiv Detail & Related papers (2025-08-06T14:53:42Z)
- RDG-GS: Relative Depth Guidance with Gaussian Splatting for Real-time Sparse-View 3D Rendering [13.684624443214599]
We present RDG-GS, a novel sparse-view 3D rendering framework with Relative Depth Guidance based on 3D Gaussian Splatting. The core innovation lies in utilizing relative depth guidance to refine the Gaussian field, steering it towards view-consistent spatial geometric representations. Across extensive experiments on Mip-NeRF360, LLFF, DTU, and Blender, RDG-GS demonstrates state-of-the-art rendering quality and efficiency.
arXiv Detail & Related papers (2025-01-19T16:22:28Z)
- MonoGSDF: Exploring Monocular Geometric Cues for Gaussian Splatting-Guided Implicit Surface Reconstruction [84.07233691641193]
We introduce MonoGSDF, a novel method that couples primitives with a neural Signed Distance Field (SDF) for high-quality reconstruction. To handle arbitrary-scale scenes, we propose a scaling strategy for robust generalization. In experiments on real-world datasets, MonoGSDF outperforms prior methods while maintaining efficiency.
arXiv Detail & Related papers (2024-11-25T20:07:07Z)
- GPS-Gaussian+: Generalizable Pixel-wise 3D Gaussian Splatting for Real-Time Human-Scene Rendering from Sparse Views [67.34073368933814]
We propose a generalizable Gaussian Splatting approach for high-resolution image rendering under a sparse-view camera setting.
We train our Gaussian parameter regression module on human-only data or human-scene data, jointly with a depth estimation module to lift 2D parameter maps to 3D space.
Experiments on several datasets demonstrate that our method outperforms state-of-the-art methods while achieving a faster rendering speed.
arXiv Detail & Related papers (2024-11-18T08:18:44Z)
- PF3plat: Pose-Free Feed-Forward 3D Gaussian Splatting [54.7468067660037]
Our framework capitalizes on the fast speed, scalability, and high-quality 3D reconstruction and view synthesis capabilities of 3DGS. PF3plat sets a new state of the art across all benchmarks, supported by comprehensive ablation studies validating our design choices.
arXiv Detail & Related papers (2024-10-29T15:28:15Z)
- PGSR: Planar-based Gaussian Splatting for Efficient and High-Fidelity Surface Reconstruction [37.14913599050765]
We propose a fast planar-based Gaussian splatting reconstruction representation (PGSR) to achieve high-fidelity surface reconstruction. We then introduce single-view geometric, multi-view photometric, and geometric regularization to preserve global geometric accuracy. Our method achieves fast training and rendering while maintaining high-fidelity rendering and geometric reconstruction, outperforming 3DGS-based and NeRF-based methods.
arXiv Detail & Related papers (2024-06-10T17:59:01Z)
- VCR-GauS: View Consistent Depth-Normal Regularizer for Gaussian Surface Reconstruction [47.603017811399624]
We propose a Depth-Normal regularizer that directly couples normals with other geometric parameters, leading to full updates of the geometric parameters from normal regularization.
We also introduce a densification and splitting strategy to regularize the size and distribution of 3D Gaussians for more accurate surface modeling.
arXiv Detail & Related papers (2024-06-09T13:15:43Z)
- FreeSplat: Generalizable 3D Gaussian Splatting Towards Free-View Synthesis of Indoor Scenes [50.534213038479926]
FreeSplat is capable of reconstructing geometrically consistent 3D scenes from long-sequence input for free-view synthesis.
We propose a simple but effective free-view training strategy that ensures robust view synthesis across a broader view range, regardless of the number of views.
arXiv Detail & Related papers (2024-05-28T08:40:14Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.