SurfaceSplat: Connecting Surface Reconstruction and Gaussian Splatting
- URL: http://arxiv.org/abs/2507.15602v2
- Date: Mon, 28 Jul 2025 06:27:30 GMT
- Title: SurfaceSplat: Connecting Surface Reconstruction and Gaussian Splatting
- Authors: Zihui Gao, Jia-Wang Bian, Guosheng Lin, Hao Chen, Chunhua Shen
- Abstract summary: Surface reconstruction and novel view rendering from sparse-view images are challenging. We propose a novel hybrid method that combines the strengths of both approaches. Our method surpasses state-of-the-art approaches in surface reconstruction and novel view synthesis on the DTU and MobileBrick datasets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Surface reconstruction and novel view rendering from sparse-view images are challenging. Signed Distance Function (SDF)-based methods struggle with fine details, while 3D Gaussian Splatting (3DGS)-based approaches lack global geometry coherence. We propose a novel hybrid method that combines the strengths of both approaches: SDF captures coarse geometry to enhance 3DGS-based rendering, while newly rendered images from 3DGS refine the details of SDF for accurate surface reconstruction. As a result, our method surpasses state-of-the-art approaches in surface reconstruction and novel view synthesis on the DTU and MobileBrick datasets. Code will be released at https://github.com/aim-uofa/SurfaceSplat.
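Read as a pipeline, the abstract describes an alternating refinement loop: the SDF supplies coarse geometry to guide 3DGS, and views rendered by 3DGS are fed back to refine the SDF. Below is a minimal, runnable control-flow sketch of that loop; every function here is a hypothetical placeholder rather than the authors' API (the repository linked above is authoritative).

```python
# A minimal control-flow sketch of the hybrid scheme described in the abstract:
# an SDF provides coarse geometry to support 3DGS rendering, and novel views
# rendered by 3DGS are fed back to refine the SDF. All names are placeholders.

def train_sdf(images, poses, init_geometry=None):
    """Placeholder: fit an SDF to the given views (optionally warm-started)."""
    return {"sdf": "coarse" if init_geometry is None else "refined"}

def train_3dgs(images, poses, coarse_geometry):
    """Placeholder: optimize 3D Gaussians, initialized/regularized by the SDF."""
    return {"gaussians": "optimized", "prior": coarse_geometry}

def render_novel_views(gaussians, novel_poses):
    """Placeholder: splat the Gaussians into images at unseen camera poses."""
    return [f"rendered@{p}" for p in novel_poses]

def hybrid_reconstruction(images, poses, novel_poses, rounds=2):
    sdf = train_sdf(images, poses)                   # coarse global geometry
    for _ in range(rounds):
        gs = train_3dgs(images, poses, sdf)          # SDF-guided splatting
        extra = render_novel_views(gs, novel_poses)  # densify supervision
        sdf = train_sdf(images + extra, poses + novel_poses, init_geometry=sdf)
    return sdf, gs

if __name__ == "__main__":
    sdf, gs = hybrid_reconstruction(["img0", "img1"], ["pose0", "pose1"], ["pose_n"])
    print(sdf, gs)
```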
Related papers
- GS-2DGS: Geometrically Supervised 2DGS for Reflective Object Reconstruction [51.99776072246151]
We propose a novel reconstruction method called GS-2DGS for reflective objects based on 2D Gaussian Splatting (2DGS). Experimental results on synthetic and real datasets demonstrate that our method significantly outperforms Gaussian-based techniques in terms of reconstruction and relighting.
arXiv Detail & Related papers (2025-06-16T05:40:16Z)
- CoSurfGS: Collaborative 3D Surface Gaussian Splatting with Distributed Learning for Large Scene Reconstruction [68.81212850946318]
We propose a multi-agent collaborative fast 3DGS surface reconstruction framework based on distributed learning for large-scale surface reconstruction. Specifically, we develop local model compression (LMC) and model aggregation schemes (MAS) to achieve high-quality surface representation of large scenes. Our proposed method can achieve fast and scalable high-fidelity surface reconstruction and photorealistic rendering.
arXiv Detail & Related papers (2024-12-23T14:31:15Z)
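The CoSurfGS entry above names two components, local model compression (LMC) and model aggregation schemes (MAS). The toy sketch below only illustrates the distributed structure they imply, with opacity-based pruning standing in for LMC and plain concatenation standing in for MAS; both stand-ins are our assumptions, not the paper's actual schemes.

```python
import numpy as np

def compress_local_model(means, opacities, keep_ratio=0.5):
    """Toy LMC: keep the highest-opacity Gaussians of one agent's region."""
    k = max(1, int(len(opacities) * keep_ratio))
    idx = np.argsort(opacities)[-k:]
    return means[idx], opacities[idx]

def aggregate_models(local_models):
    """Toy MAS: concatenate the compressed per-agent models into one scene."""
    means = np.concatenate([m for m, _ in local_models], axis=0)
    opac = np.concatenate([o for _, o in local_models], axis=0)
    return means, opac

rng = np.random.default_rng(0)
agents = [(rng.normal(size=(100, 3)), rng.random(100)) for _ in range(4)]
compressed = [compress_local_model(m, o) for m, o in agents]
scene_means, scene_opac = aggregate_models(compressed)
print(scene_means.shape, scene_opac.shape)  # (200, 3) (200,)
```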
- SolidGS: Consolidating Gaussian Surfel Splatting for Sparse-View Surface Reconstruction [48.228533595941556]
We observe that reconstructed geometry can be severely inconsistent across multiple views in sparse-view settings. We propose a novel method called SolidGS to address this problem. With the additional help of geometrical regularization and monocular normal estimation, our method achieves superior performance on sparse-view surface reconstruction.
arXiv Detail & Related papers (2024-12-19T21:04:43Z)
- MonoGSDF: Exploring Monocular Geometric Cues for Gaussian Splatting-Guided Implicit Surface Reconstruction [84.07233691641193]
We introduce MonoGSDF, a novel method that couples Gaussian primitives with a neural Signed Distance Field (SDF) for high-quality reconstruction. To handle arbitrary-scale scenes, we propose a scaling strategy for robust generalization. In experiments on real-world datasets, MonoGSDF outperforms prior methods while maintaining efficiency.
arXiv Detail & Related papers (2024-11-25T20:07:07Z)
- GSurf: 3D Reconstruction via Signed Distance Fields with Direct Gaussian Supervision [3.2944910942497985]
Surface reconstruction from multi-view images is a core challenge in 3D vision. Recent studies have explored signed distance fields (SDF) within Neural Radiance Fields (NeRF) to achieve high-fidelity surface reconstructions. We introduce GSurf, a novel end-to-end method for learning a signed distance field directly from Gaussian primitives. GSurf achieves faster training and rendering speeds while delivering 3D reconstruction quality comparable to neural implicit surface methods, such as VolSDF and NeuS.
arXiv Detail & Related papers (2024-11-24T05:55:19Z)
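One way to picture GSurf's "direct Gaussian supervision" is to treat optimized Gaussian centers as near-surface samples for an SDF. The PyTorch sketch below does exactly that, adding an eikonal regularizer; the network, sampling, and loss weights are illustrative assumptions, not GSurf's actual formulation.

```python
import torch

# Toy "direct Gaussian supervision": push the SDF to zero at Gaussian
# centers while an eikonal term keeps it a valid distance field.
sdf = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.Softplus(beta=100),
    torch.nn.Linear(64, 64), torch.nn.Softplus(beta=100),
    torch.nn.Linear(64, 1),
)

centers = torch.randn(1024, 3)  # stand-in for optimized Gaussian means
opt = torch.optim.Adam(sdf.parameters(), lr=1e-3)

for step in range(200):
    opt.zero_grad()
    surf_loss = sdf(centers).abs().mean()  # SDF(mu_i) -> 0 at centers
    x = torch.randn(1024, 3, requires_grad=True)
    g = torch.autograd.grad(sdf(x).sum(), x, create_graph=True)[0]
    eik_loss = ((g.norm(dim=-1) - 1.0) ** 2).mean()  # |grad SDF| -> 1
    (surf_loss + 0.1 * eik_loss).backward()
    opt.step()
```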
- GigaGS: Scaling up Planar-Based 3D Gaussians for Large Scene Surface Reconstruction [71.08607897266045]
3D Gaussian Splatting (3DGS) has shown promising performance in novel view synthesis.
We make the first attempt to tackle the challenging task of large-scale scene surface reconstruction, proposing GigaGS, the first work to achieve high-quality surface reconstruction for large-scale scenes using 3DGS.
arXiv Detail & Related papers (2024-09-10T17:51:39Z)
- 3D Gaussian Splatting for Large-scale Surface Reconstruction from Aerial Images [6.076999957937232]
We propose a novel 3DGS-based method for large-scale surface reconstruction using aerial multi-view stereo (MVS) images, named Aerial Gaussian Splatting (AGS).
First, we introduce a data chunking method tailored for large-scale aerial images, making 3DGS feasible for surface reconstruction over extensive scenes.
Second, we integrate the Ray-Gaussian Intersection method into 3DGS to obtain depth and normal information.
arXiv Detail & Related papers (2024-08-31T08:17:24Z)
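Of AGS's two ingredients, the data-chunking step is the easier one to illustrate: partition the ground plane into tiles and assign each aerial camera to every tile it overlaps, so each chunk can be reconstructed independently. The sketch below is a toy version with made-up tile and overlap sizes; AGS's actual chunking is tailored to aerial MVS imagery.

```python
import numpy as np

def chunk_cameras(cam_xy, tile=100.0, overlap=20.0):
    """Map tile index -> camera indices whose x/y fall in (or near) the tile."""
    chunks = {}
    for i, (x, y) in enumerate(cam_xy):
        # a camera within `overlap` of a tile border lands in every tile it touches
        for tx in {int((x - overlap) // tile), int((x + overlap) // tile)}:
            for ty in {int((y - overlap) // tile), int((y + overlap) // tile)}:
                chunks.setdefault((tx, ty), []).append(i)
    return chunks

cams = np.random.default_rng(1).uniform(0, 500, size=(1000, 2))
chunks = chunk_cameras(cams)
print(len(chunks), "chunks; largest has", max(map(len, chunks.values())), "cameras")
```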
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
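The ray-tracing view of Gaussian volume rendering that GOF builds on has a convenient closed form: a 3D Gaussian's density along a ray peaks where the quadratic Mahalanobis term is minimized. The sketch below computes that peak depth and contribution; variable names are ours, and GOF's full opacity-field construction goes well beyond this single step.

```python
import numpy as np

# Maximize the Gaussian density along the ray r(t) = o + t*d: the quadratic
# q(t) = (o + t*d - mu)^T Sigma^{-1} (o + t*d - mu) = a*t^2 + 2*b*t + c
# is minimized at t* = -b/a, with a = d^T S d, b = d^T S (o - mu), S = Sigma^{-1}.
def max_gaussian_along_ray(o, d, mu, sigma_inv, opacity):
    delta = o - mu
    a = d @ sigma_inv @ d
    b = d @ sigma_inv @ delta
    c = delta @ sigma_inv @ delta
    t_star = -b / a                # depth of peak contribution
    q_min = c - b * b / a          # Mahalanobis distance at the peak
    return t_star, opacity * np.exp(-0.5 * q_min)

o = np.zeros(3); d = np.array([0.0, 0.0, 1.0])
mu = np.array([0.1, -0.1, 2.0]); sigma_inv = np.eye(3) / 0.05
print(max_gaussian_along_ray(o, d, mu, sigma_inv, opacity=0.9))
```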
- SplatFace: Gaussian Splat Face Reconstruction Leveraging an Optimizable Surface [7.052369521411523]
We present SplatFace, a novel Gaussian splatting framework designed for 3D human face reconstruction without reliance on accurate pre-determined geometry.
Our method is designed to simultaneously deliver both high-quality novel view rendering and accurate 3D mesh reconstructions.
arXiv Detail & Related papers (2024-03-27T17:32:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.