RayletDF: Raylet Distance Fields for Generalizable 3D Surface Reconstruction from Point Clouds or Gaussians
- URL: http://arxiv.org/abs/2508.09830v1
- Date: Wed, 13 Aug 2025 14:05:21 GMT
- Title: RayletDF: Raylet Distance Fields for Generalizable 3D Surface Reconstruction from Point Clouds or Gaussians
- Authors: Shenxing Wei, Jinxi Li, Yafei Yang, Siyuan Zhou, Bo Yang
- Abstract summary: We present a generalizable method for 3D surface reconstruction from raw point clouds or from 3D Gaussians pre-estimated by 3DGS from RGB images. Unlike existing coordinate-based methods, which are often computationally intensive when rendering explicit surfaces, our proposed method, named RayletDF, introduces a new technique called the raylet distance field. Our method achieves exceptional generalization ability, successfully recovering 3D surfaces in a single forward pass across unseen datasets in testing.
- Score: 13.446448045482176
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we present a generalizable method for 3D surface reconstruction from raw point clouds or from 3D Gaussians pre-estimated by 3DGS from RGB images. Unlike existing coordinate-based methods, which are often computationally intensive when rendering explicit surfaces, our proposed method, named RayletDF, introduces a new technique called the raylet distance field, which aims to directly predict surface points from query rays. Our pipeline consists of three key modules: a raylet feature extractor, a raylet distance field predictor, and a multi-raylet blender. These components work together to extract fine-grained local geometric features, predict raylet distances, and aggregate multiple predictions to reconstruct precise surface points. We extensively evaluate our method on multiple public real-world datasets, demonstrating superior performance in surface reconstruction from point clouds or 3D Gaussians. Most notably, our method achieves exceptional generalization ability, successfully recovering 3D surfaces in a single forward pass across unseen datasets in testing.
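The abstract's core idea can be illustrated with a minimal numpy sketch: each raylet (origin plus unit direction) gets a predicted distance to the surface, and a multi-raylet blender aggregates the candidate surface points by confidence. All names, shapes, and the weighted-average blending rule below are illustrative assumptions; in the paper these quantities come from learned networks, not closed-form geometry.

```python
import numpy as np

def blend_raylets(origins, dirs, dists, confs):
    """Aggregate per-raylet surface predictions into one surface point.

    origins: (K, 3) raylet origins
    dirs:    (K, 3) unit raylet directions
    dists:   (K,)   predicted raylet distances
    confs:   (K,)   predicted confidences (unnormalized)
    """
    points = origins + dists[:, None] * dirs        # candidate surface points
    weights = confs / confs.sum()                   # normalize confidences
    return (weights[:, None] * points).sum(axis=0)  # confidence-weighted blend

# Toy example: three raylets aimed at the plane z = 1 from different origins.
origins = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])
dirs = np.array([[0.0, 0.0, 1.0],
                 [-1.0, 0.0, 1.0],
                 [0.0, -1.0, 1.0]])
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
# Exact ray-plane distances stand in for the distance-field network's output.
dists = (1.0 - origins[:, 2]) / dirs[:, 2]
confs = np.ones(3)

p = blend_raylets(origins, dirs, dists, confs)  # all raylets agree on (0, 0, 1)
```

The appeal of this formulation over coordinate-based implicit surfaces is that each pixel's depth needs only one network query per raylet, rather than the many queries of sphere tracing or marching cubes.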
Related papers
- Neural Implicit Representations for 3D Synthetic Aperture Radar Imaging [0.0]
Synthetic aperture radar (SAR) is a tomographic sensor that measures 2D slices of the 3D spatial Fourier transform of the scene. In many operational scenarios, the measured set of 2D slices does not fill the 3D space in the Fourier domain. We review our recent work that achieves state-of-the-art results in 3D SAR imaging, employing neural structures to model the surface scattering that dominates SAR returns.
arXiv Detail & Related papers (2026-02-19T17:10:37Z) - LaRI: Layered Ray Intersections for Single-view 3D Geometric Reasoning [75.9814389360821]
Layered ray intersections (LaRI) is a new method for reasoning about unseen geometry from a single image. Benefiting from the compact and layered representation, LaRI enables complete, efficient, and view-aligned geometric reasoning. We build a complete training-data generation pipeline for synthetic and real-world data, including 3D objects and scenes.
arXiv Detail & Related papers (2025-04-25T15:31:29Z) - LineGS : 3D Line Segment Representation on 3D Gaussian Splatting [0.0]
LineGS is a novel method that combines geometry-guided 3D line reconstruction with a 3D Gaussian splatting model. The results show significant improvements in both geometric accuracy and model compactness compared to baseline methods.
arXiv Detail & Related papers (2024-11-30T13:29:36Z) - MonoGSDF: Exploring Monocular Geometric Cues for Gaussian Splatting-Guided Implicit Surface Reconstruction [84.07233691641193]
We introduce MonoGSDF, a novel method that couples primitives with a neural Signed Distance Field (SDF) for high-quality reconstruction. To handle arbitrary-scale scenes, we propose a scaling strategy for robust generalization. Experiments on real-world datasets show that it outperforms prior methods while maintaining efficiency.
arXiv Detail & Related papers (2024-11-25T20:07:07Z) - GeoSplatting: Towards Geometry Guided Gaussian Splatting for Physically-based Inverse Rendering [69.67264955234494]
GeoSplatting is a novel approach that augments 3DGS with explicit geometry guidance for precise light transport modeling. By differentiably constructing a surface-grounded 3DGS from an optimizable mesh, our approach leverages well-defined mesh normals and the opaque mesh surface. This enhancement ensures precise material decomposition while preserving the efficiency and high-quality rendering capabilities of 3DGS.
arXiv Detail & Related papers (2024-10-31T17:57:07Z) - GS-Octree: Octree-based 3D Gaussian Splatting for Robust Object-level 3D Reconstruction Under Strong Lighting [4.255847344539736]
We introduce a novel approach that combines octree-based implicit surface representations with Gaussian splatting.
Our method, which leverages the distribution of 3D Gaussians with SDFs, reconstructs more accurate geometry, particularly in images with specular highlights caused by strong lighting.
arXiv Detail & Related papers (2024-06-26T09:29:56Z) - Normal-guided Detail-Preserving Neural Implicit Function for High-Fidelity 3D Surface Reconstruction [6.4279213810512665]
This paper shows that training neural representations with first-order differential properties (surface normals) leads to highly accurate 3D surface reconstruction. Experiments demonstrate that our method achieves state-of-the-art reconstruction accuracy with a minimal number of views.
arXiv Detail & Related papers (2024-06-07T11:48:47Z) - 3D Neural Edge Reconstruction [61.10201396044153]
We introduce EMAP, a new method for learning 3D edge representations with a focus on both lines and curves.
Our method implicitly encodes 3D edge distance and direction in Unsigned Distance Functions (UDF) from multi-view edge maps.
On top of this neural representation, we propose an edge extraction algorithm that robustly abstracts 3D edges from the inferred edge points and their directions.
arXiv Detail & Related papers (2024-05-29T17:23:51Z) - Gaussian Opacity Fields: Efficient Adaptive Surface Reconstruction in Unbounded Scenes [50.92217884840301]
Gaussian Opacity Fields (GOF) is a novel approach for efficient, high-quality, and adaptive surface reconstruction in unbounded scenes.
GOF is derived from ray-tracing-based volume rendering of 3D Gaussians.
GOF surpasses existing 3DGS-based methods in surface reconstruction and novel view synthesis.
arXiv Detail & Related papers (2024-04-16T17:57:19Z) - GIR: 3D Gaussian Inverse Rendering for Relightable Scene Factorization [62.13932669494098]
This paper presents a 3D Gaussian Inverse Rendering (GIR) method, employing 3D Gaussian representations to factorize the scene into material properties, light, and geometry.
We compute the normal of each 3D Gaussian using the shortest eigenvector, with a directional masking scheme forcing accurate normal estimation without external supervision.
We adopt an efficient voxel-based indirect illumination tracing scheme that stores direction-aware outgoing radiance in each 3D Gaussian to disentangle secondary illumination for approximating multi-bounce light transport.
arXiv Detail & Related papers (2023-12-08T16:05:15Z) - NeuSG: Neural Implicit Surface Reconstruction with 3D Gaussian Splatting Guidance [48.72360034876566]
We propose a neural implicit surface reconstruction pipeline with guidance from 3D Gaussian Splatting to recover highly detailed surfaces. The advantage of 3D Gaussian Splatting is that it can generate dense point clouds with detailed structure. We introduce a scale regularizer that pulls the centers close to the surface by enforcing the 3D Gaussians to be extremely thin.
arXiv Detail & Related papers (2023-12-01T07:04:47Z) - RayDF: Neural Ray-surface Distance Fields with Multi-view Consistency [10.55497978011315]
We propose a new framework called RayDF to formulate 3D shapes as ray-based neural functions.
Our method renders an 800x800 depth image 1000x faster than coordinate-based methods.
arXiv Detail & Related papers (2023-10-30T15:22:50Z)
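Several Gaussian-based entries above (GIR in particular) estimate a per-Gaussian normal as the eigenvector of the covariance with the smallest eigenvalue, since a flattened Gaussian behaves like a surface patch whose shortest axis points out of the surface. A minimal sketch of that single step, assuming a known covariance matrix (illustrative only, not the papers' code):

```python
import numpy as np

def gaussian_normal(cov):
    """Normal of a 3D Gaussian: the eigenvector of its covariance
    associated with the smallest eigenvalue (the 'shortest' axis)."""
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, 0]                    # column for the smallest eigenvalue

# A Gaussian flattened along z: its normal should be +/- z (sign is ambiguous,
# which is why GIR adds a directional masking scheme on top of this step).
cov = np.diag([1.0, 0.5, 0.01])
n = gaussian_normal(cov)
```

Note the sign ambiguity: an eigenvector is only defined up to +/- 1, so in practice the normal must be oriented, e.g. toward the camera or by a learned mask as in GIR.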
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.