GausSurf: Geometry-Guided 3D Gaussian Splatting for Surface Reconstruction
- URL: http://arxiv.org/abs/2411.19454v2
- Date: Mon, 02 Dec 2024 05:47:15 GMT
- Title: GausSurf: Geometry-Guided 3D Gaussian Splatting for Surface Reconstruction
- Authors: Jiepeng Wang, Yuan Liu, Peng Wang, Cheng Lin, Junhui Hou, Xin Li, Taku Komura, Wenping Wang
- Abstract summary: GausSurf employs geometry guidance from multi-view consistency in texture-rich areas and normal priors in texture-less areas of a scene.
Our method surpasses state-of-the-art methods in terms of reconstruction quality and computation time.
- Score: 79.42244344704154
- License:
- Abstract: 3D Gaussian Splatting has achieved impressive performance in novel view synthesis with real-time rendering capabilities. However, reconstructing high-quality surfaces with fine details using 3D Gaussians remains a challenging task. In this work, we introduce GausSurf, a novel approach to high-quality surface reconstruction by employing geometry guidance from multi-view consistency in texture-rich areas and normal priors in texture-less areas of a scene. We observe that a scene can be mainly divided into two primary regions: 1) texture-rich and 2) texture-less areas. To enforce multi-view consistency at texture-rich areas, we enhance the reconstruction quality by incorporating a traditional patch-match based Multi-View Stereo (MVS) approach to guide the geometry optimization in an iterative scheme. This scheme allows for mutual reinforcement between the optimization of Gaussians and patch-match refinement, which significantly improves the reconstruction results and accelerates the training process. Meanwhile, for the texture-less areas, we leverage normal priors from a pre-trained normal estimation model to guide optimization. Extensive experiments on the DTU and Tanks and Temples datasets demonstrate that our method surpasses state-of-the-art methods in terms of reconstruction quality and computation time.
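The abstract outlines two complementary geometry cues: patch-match MVS depth for texture-rich pixels and monocular normal priors for texture-less pixels. The PyTorch sketch below shows one plausible way such a hybrid supervision term could be assembled; the gradient-based texture mask, tensor shapes, loss weights, and function names are illustrative assumptions, not the authors' implementation, which additionally refines the MVS depths against the current Gaussian geometry in an iterative loop.

```python
# Hedged sketch of a hybrid geometry-guidance loss in the spirit of the abstract:
# MVS-depth supervision on texture-rich pixels, normal-prior supervision on
# texture-less pixels. All names, shapes, and weights are illustrative assumptions.
import torch
import torch.nn.functional as F


def texture_mask_from_image(image, grad_thresh=0.05):
    """Illustrative proxy for the texture-rich / texture-less partition:
    a pixel counts as texture-rich when its local color-gradient magnitude
    exceeds a threshold. image: (B, 3, H, W) in [0, 1]; returns (B, H, W)."""
    gray = image.mean(dim=1, keepdim=True)
    dx = F.pad(gray[..., :, 1:] - gray[..., :, :-1], (0, 1, 0, 0))
    dy = F.pad(gray[..., 1:, :] - gray[..., :-1, :], (0, 0, 0, 1))
    grad_mag = torch.sqrt(dx ** 2 + dy ** 2)
    return (grad_mag > grad_thresh).float().squeeze(1)


def geometry_guidance_loss(rendered_depth, mvs_depth, mvs_valid,
                           rendered_normal, prior_normal, texture_mask,
                           w_depth=1.0, w_normal=0.1):
    """rendered_depth / mvs_depth / mvs_valid / texture_mask: (B, H, W);
    rendered_normal / prior_normal: (B, 3, H, W), assumed unit-length."""
    # Texture-rich pixels: L1 to patch-match MVS depth, restricted to pixels
    # where the MVS estimate passed its consistency check (mvs_valid).
    rich = texture_mask * mvs_valid
    l_depth = (rich * (rendered_depth - mvs_depth).abs()).sum() / rich.sum().clamp(min=1.0)

    # Texture-less pixels: 1 - cosine similarity to the monocular normal prior.
    flat = 1.0 - texture_mask
    cos = F.cosine_similarity(rendered_normal, prior_normal, dim=1)  # (B, H, W)
    l_normal = (flat * (1.0 - cos)).sum() / flat.sum().clamp(min=1.0)

    return w_depth * l_depth + w_normal * l_normal
```

In the paper's iterative scheme the Gaussian optimization and the patch-match refinement reinforce each other, so inputs like `mvs_depth` and `mvs_valid` would be periodically re-estimated during training rather than fixed up front.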
Related papers
- 3D Gaussian Splatting with Normal Information for Mesh Extraction and Improved Rendering [8.59572577251833]
We propose a novel regularization method using the gradients of a signed distance function estimated from the Gaussians.
We demonstrate the effectiveness of our approach on datasets such as Mip-NeRF360, Tanks and Temples, and Deep-Blending.
arXiv Detail & Related papers (2025-01-14T18:40:33Z)
- 2DGS-Room: Seed-Guided 2D Gaussian Splatting with Geometric Constraints for High-Fidelity Indoor Scene Reconstruction [3.8879997968084137]
We introduce 2DGS-Room, a novel method leveraging 2D Gaussian Splatting for high-fidelity indoor scene reconstruction.
We employ a seed-guided mechanism to control the distribution of 2D Gaussians, with the density of seed points dynamically optimized through adaptive growth and pruning mechanisms.
To further improve geometric accuracy, we incorporate monocular depth and normal priors to provide constraints for details and textureless regions respectively.
arXiv Detail & Related papers (2024-12-04T16:17:47Z)
- TOrtho-Gaussian: Splatting True Digital Orthophoto Maps [12.796166971391774]
True Digital Orthophoto Maps (TDOMs) are essential products for digital twins and Geographic Information Systems (GIS).
Traditionally, TDOM generation involves a complex set of photogrammetric processes, whose results may deteriorate under various challenges.
We introduce TOrtho-Gaussian, a novel method inspired by 3D Gaussian Splatting (3DGS) that generates TDOMs through splatting of optimized anisotropic Gaussian kernels.
arXiv Detail & Related papers (2024-11-29T10:22:38Z)
- GUS-IR: Gaussian Splatting with Unified Shading for Inverse Rendering [83.69136534797686]
We present GUS-IR, a novel framework designed to address the inverse rendering problem for complicated scenes featuring rough and glossy surfaces.
This paper starts by analyzing and comparing two prominent shading techniques commonly used for inverse rendering: forward shading and deferred shading.
We propose a unified shading solution that combines the advantages of both techniques for better decomposition.
arXiv Detail & Related papers (2024-11-12T01:51:05Z)
- DreamPolish: Domain Score Distillation With Progressive Geometry Generation [66.94803919328815]
We introduce DreamPolish, a text-to-3D generation model that excels in producing refined geometry and high-quality textures.
In the geometry construction phase, our approach leverages multiple neural representations to enhance the stability of the synthesis process.
In the texture generation phase, we introduce a novel score distillation objective, namely domain score distillation (DSD), to guide neural representations toward a domain of photorealistic textures.
arXiv Detail & Related papers (2024-11-03T15:15:01Z)
- SplatFace: Gaussian Splat Face Reconstruction Leveraging an Optimizable Surface [7.052369521411523]
We present SplatFace, a novel Gaussian splatting framework designed for 3D human face reconstruction without reliance on accurate pre-determined geometry.
Our method is designed to simultaneously deliver both high-quality novel view rendering and accurate 3D mesh reconstructions.
arXiv Detail & Related papers (2024-03-27T17:32:04Z)
- 2D Gaussian Splatting for Geometrically Accurate Radiance Fields [50.056790168812114]
3D Gaussian Splatting (3DGS) has recently revolutionized radiance field reconstruction, achieving high quality novel view synthesis and fast rendering speed without baking.
We present 2D Gaussian Splatting (2DGS), a novel approach to model and reconstruct geometrically accurate radiance fields from multi-view images.
We demonstrate that our differentiable terms allow for noise-free and detailed geometry reconstruction while maintaining competitive appearance quality, fast training speed, and real-time rendering.
arXiv Detail & Related papers (2024-03-26T17:21:24Z)
- DN-Splatter: Depth and Normal Priors for Gaussian Splatting and Meshing [19.437747560051566]
We propose an adaptive depth loss based on the gradient of color images, improving depth estimation and novel view synthesis results over various baselines (see the sketch after this list).
Our simple yet effective regularization technique enables direct mesh extraction from the Gaussian representation, yielding more physically accurate reconstructions of indoor scenes.
arXiv Detail & Related papers (2024-03-26T16:00:31Z)
- Indoor Scene Reconstruction with Fine-Grained Details Using Hybrid Representation and Normal Prior Enhancement [50.56517624931987]
The reconstruction of indoor scenes from multi-view RGB images is challenging due to the prevalence of flat and texture-less regions.
Recent methods leverage neural radiance fields aided by predicted surface normal priors to recover the scene geometry.
This work aims to reconstruct high-fidelity surfaces with fine-grained details by addressing the limitations of these approaches.
arXiv Detail & Related papers (2023-09-14T12:05:29Z)
- Delicate Textured Mesh Recovery from NeRF via Adaptive Surface Refinement [78.48648360358193]
We present a novel framework that generates textured surface meshes from images.
Our approach begins by efficiently initializing the geometry and view-dependent appearance with a NeRF.
We then jointly refine the appearance and geometry and bake the appearance into texture images for real-time rendering.
arXiv Detail & Related papers (2023-03-03T17:14:44Z)
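The DN-Splatter entry above mentions an adaptive depth loss driven by color-image gradients. Below is a minimal sketch of one such weighting, assuming depth supervision is down-weighted where the color gradient is large (i.e., near likely depth discontinuities); the function name, the exponential weighting, and the L1 penalty are assumptions for illustration and may differ from that paper's exact formulation.

```python
# Hedged sketch of a gradient-adaptive depth loss (cf. the DN-Splatter entry above).
# The exponential down-weighting near strong color edges and the L1 penalty are
# assumptions for illustration, not necessarily the paper's exact formulation.
import torch


def adaptive_depth_loss(rendered_depth, sensor_depth, image, alpha=10.0):
    """rendered_depth, sensor_depth: (B, H, W); image: (B, 3, H, W) in [0, 1]."""
    gray = image.mean(dim=1)                      # (B, H, W)
    dx = torch.zeros_like(gray)
    dy = torch.zeros_like(gray)
    dx[..., :, :-1] = (gray[..., :, 1:] - gray[..., :, :-1]).abs()
    dy[..., :-1, :] = (gray[..., 1:, :] - gray[..., :-1, :]).abs()
    weight = torch.exp(-alpha * (dx + dy))        # low weight at strong color edges
    valid = (sensor_depth > 0).float()            # skip pixels with missing depth
    err = (rendered_depth - sensor_depth).abs()
    return (weight * valid * err).sum() / (weight * valid).sum().clamp(min=1.0)
```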