NeuSG: Neural Implicit Surface Reconstruction with 3D Gaussian Splatting Guidance
- URL: http://arxiv.org/abs/2312.00846v1
- Date: Fri, 1 Dec 2023 07:04:47 GMT
- Title: NeuSG: Neural Implicit Surface Reconstruction with 3D Gaussian Splatting Guidance
- Authors: Hanlin Chen, Chen Li, Gim Hee Lee
- Abstract summary: We propose a neural implicit surface reconstruction pipeline with guidance from 3D Gaussian Splatting to recover highly detailed surfaces.
The advantage of 3D Gaussian Splatting is that it can generate dense point clouds with detailed structure.
We introduce a scale regularizer to pull the centers close to the surface by enforcing the 3D Gaussians to be extremely thin.
- Score: 59.08521048003009
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Existing neural implicit surface reconstruction methods have achieved
impressive performance in multi-view 3D reconstruction by leveraging explicit
geometry priors such as depth maps or point clouds as regularization. However,
the reconstruction results still lack fine details because of the over-smoothed
depth map or sparse point cloud. In this work, we propose a neural implicit
surface reconstruction pipeline with guidance from 3D Gaussian Splatting to
recover highly detailed surfaces. The advantage of 3D Gaussian Splatting is
that it can generate dense point clouds with detailed structure. Nonetheless, a
naive adoption of 3D Gaussian Splatting can fail since the generated points are
the centers of 3D Gaussians that do not necessarily lie on the surface. We thus
introduce a scale regularizer to pull the centers close to the surface by
enforcing the 3D Gaussians to be extremely thin. Moreover, we propose to refine
the point cloud from 3D Gaussian Splatting with the normal priors from the
surface predicted by neural implicit models instead of using a fixed set of
points as guidance. Consequently, the quality of surface reconstruction
improves with guidance from the more accurate 3D Gaussian Splatting. By
jointly optimizing the 3D Gaussian Splatting and the neural implicit model, our
approach benefits from both representations and generates complete surfaces
with intricate details. Experiments on Tanks and Temples verify the
effectiveness of our proposed method.
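The scale regularizer from the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name and the exact loss form (mean of each Gaussian's smallest axis scale) are assumptions, but the idea matches the abstract — penalizing the thinnest axis of every Gaussian drives each one toward a flat disc, pulling its center onto the surface.

```python
# Minimal sketch of a scale regularizer that flattens 3D Gaussians.
# Each Gaussian stores three axis scales (sx, sy, sz); penalizing the
# smallest one encourages every Gaussian to become extremely thin,
# so its center lies close to the surface it represents.

def scale_regularizer(scales):
    """scales: list of (sx, sy, sz) tuples, one per Gaussian.
    Returns the mean of the smallest axis scale over all Gaussians."""
    return sum(min(s) for s in scales) / len(scales)

# A Gaussian with one near-zero axis contributes almost no penalty,
# while a fat Gaussian is penalized by its thinnest axis.
thin = [(0.5, 0.4, 0.001)]
fat = [(0.5, 0.4, 0.3)]
print(scale_regularizer(thin))  # → 0.001
print(scale_regularizer(fat))   # → 0.3
```

In a real training loop this term would be added to the rendering loss, so gradient descent trades off photometric fidelity against Gaussian thinness.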
Related papers
- Neural Signed Distance Function Inference through Splatting 3D Gaussians Pulled on Zero-Level Set [49.780302894956776]
It is vital to infer a signed distance function (SDF) in multi-view based surface reconstruction.
We propose a method that seamlessly merges 3DGS with the learning of neural SDFs.
Our numerical and visual comparisons show our superiority over the state-of-the-art results on the widely used benchmarks.
arXiv Detail & Related papers (2024-10-18T05:48:06Z)
- MeshGS: Adaptive Mesh-Aligned Gaussian Splatting for High-Quality Rendering [61.64903786502728]
We propose a novel approach that integrates mesh representation with 3D Gaussian splats to perform high-quality rendering of reconstructed real-world scenes.
We consider the distance between each Gaussian splat and the mesh surface to distinguish between tightly-bound and loosely-bound splats.
Our method surpasses recent mesh-based neural rendering techniques by achieving a 2 dB higher PSNR, and outperforms mesh-based Gaussian splatting methods by 1.3 dB in PSNR.
arXiv Detail & Related papers (2024-10-11T16:07:59Z)
- RaDe-GS: Rasterizing Depth in Gaussian Splatting [32.38730602146176]
Gaussian Splatting (GS) has proven to be highly effective in novel view synthesis, achieving high-quality and real-time rendering.
Our method achieves a Chamfer distance error comparable to Neuralangelo on the DTU dataset while maintaining computational efficiency similar to the original 3DGS method.
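The Chamfer distance used here as the reconstruction error can be sketched as a symmetric nearest-neighbor average between two point sets. This brute-force version is for illustration only; real benchmark evaluations use KD-trees and often report the two directions separately.

```python
# Sketch of the symmetric Chamfer distance between two point sets:
# for each point, find its nearest neighbor in the other set, average
# those distances, and sum both directions. Brute force, O(n*m).
import math

def chamfer_distance(pts_a, pts_b):
    def one_way(src, dst):
        total = 0.0
        for p in src:
            total += min(math.dist(p, q) for q in dst)
        return total / len(src)
    return one_way(pts_a, pts_b) + one_way(pts_b, pts_a)

a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
b = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.1)]
print(chamfer_distance(a, b))  # ≈ 0.1 (about 0.05 each way)
```

Identical point sets score 0, and the metric grows as the reconstructed surface drifts from the ground-truth scan.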
arXiv Detail & Related papers (2024-06-03T15:56:58Z)
- GaussianRoom: Improving 3D Gaussian Splatting with SDF Guidance and Monocular Cues for Indoor Scene Reconstruction [3.043712258792239]
We present a unified framework integrating neural SDF with 3DGS.
This framework incorporates a learnable neural SDF field to guide the densification and pruning of Gaussians.
Our method achieves state-of-the-art performance in both surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-05-30T03:46:59Z)
- High-quality Surface Reconstruction using Gaussian Surfels [18.51978059665113]
We propose a novel point-based representation, Gaussian surfels, combining the flexible optimization procedure of 3D Gaussian points with the surface alignment property of surfels.
This is achieved by setting the z-scale of 3D Gaussian points to 0, effectively flattening the original 3D ellipsoid into a 2D ellipse.
By treating the local z-axis as the normal direction, it greatly improves optimization stability and surface alignment.
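The flattening step described above can be sketched as follows. The function name and data layout are assumptions for illustration; the actual method parameterizes rotations with quaternions and runs on the GPU, but the geometric idea is the same: zero the z-scale so the ellipsoid collapses to an ellipse, and read the surfel normal off the local z-axis.

```python
# Sketch of the Gaussian-surfel flattening: setting the z-scale to 0
# collapses the 3D ellipsoid into a 2D ellipse, and the local z-axis
# (third column of the rotation matrix) serves as the surfel normal.

def flatten_to_surfel(scale, rotation):
    """scale: (sx, sy, sz); rotation: 3x3 matrix as nested lists,
    whose columns are the Gaussian's local axes.
    Returns the flattened scale and the surfel normal (local z-axis)."""
    flat_scale = (scale[0], scale[1], 0.0)
    normal = tuple(rotation[i][2] for i in range(3))  # third column
    return flat_scale, normal

identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(flatten_to_surfel((0.3, 0.2, 0.1), identity))
# → ((0.3, 0.2, 0.0), (0.0, 0.0, 1.0))
```

Because the normal falls directly out of the rotation, it can be supervised or regularized without any extra normal-estimation pass.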
arXiv Detail & Related papers (2024-04-27T04:13:39Z)
- Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z)
- 3DGSR: Implicit Surface Reconstruction with 3D Gaussian Splatting [58.95801720309658]
In this paper, we present an implicit surface reconstruction method with 3D Gaussian Splatting (3DGS), namely 3DGSR.
The key insight is incorporating an implicit signed distance field (SDF) within 3D Gaussians to enable them to be aligned and jointly optimized.
Our experimental results demonstrate that our 3DGSR method enables high-quality 3D surface reconstruction while preserving the efficiency and rendering quality of 3DGS.
arXiv Detail & Related papers (2024-03-30T16:35:38Z)
- 2D Gaussian Splatting for Geometrically Accurate Radiance Fields [50.056790168812114]
3D Gaussian Splatting (3DGS) has recently revolutionized radiance field reconstruction, achieving high quality novel view synthesis and fast rendering speed without baking.
We present 2D Gaussian Splatting (2DGS), a novel approach to model and reconstruct geometrically accurate radiance fields from multi-view images.
We demonstrate that our differentiable terms allow for noise-free and detailed geometry reconstruction while maintaining competitive appearance quality, fast training speed, and real-time rendering.
arXiv Detail & Related papers (2024-03-26T17:21:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.