NeUDF: Learning Neural Unsigned Distance Fields with Volume Rendering
- URL: http://arxiv.org/abs/2304.10080v1
- Date: Thu, 20 Apr 2023 04:14:42 GMT
- Title: NeUDF: Learning Neural Unsigned Distance Fields with Volume Rendering
- Authors: Yu-Tao Liu, Li Wang, Jie Yang, Weikai Chen, Xiaoxu Meng, Bo Yang, Lin
Gao
- Abstract summary: NeUDF can reconstruct surfaces with arbitrary topologies solely from multi-view supervision.
We extensively evaluate our method over a number of challenging datasets, including DTU, MGN, and Deep Fashion 3D.
- Score: 25.078149064632218
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-view shape reconstruction has achieved impressive progress thanks to
the latest advances in neural implicit surface rendering. However, existing
methods based on signed distance function (SDF) are limited to closed surfaces,
failing to reconstruct a wide range of real-world objects that contain
open-surface structures. In this work, we introduce a new neural rendering
framework, called NeUDF, that can reconstruct surfaces with arbitrary topologies
solely from multi-view supervision. To gain the flexibility of representing
arbitrary surfaces, NeUDF leverages the unsigned distance function (UDF) as
surface representation. While a naive extension of an SDF-based neural renderer
cannot scale to UDF, we propose two new formulations of weight function
specially tailored for UDF-based volume rendering. Furthermore, to cope with
open surface rendering, where the in/out test is no longer valid, we present a
dedicated normal regularization strategy to resolve the surface orientation
ambiguity. We extensively evaluate our method over a number of challenging
datasets, including DTU, MGN, and Deep Fashion 3D. Experimental results
demonstrate that NeUDF can significantly outperform state-of-the-art methods
in the task of multi-view surface reconstruction, especially for complex shapes
with open boundaries.
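The abstract names two new weight-function formulations but does not state them; the sketch below is a generic, hypothetical illustration (not NeUDF's actual formulation) of how volume-rendering weights can be derived from unsigned distances along a ray, with the weight peaking at the surface even though the UDF never changes sign:

```python
import numpy as np

def udf_render_weights(udf_vals, beta=0.05, dt=0.01):
    """Toy volume-rendering weights from unsigned distances along a ray.

    Hypothetical example, not NeUDF's published formulation: density
    peaks where the UDF is small, and per-sample weights are
    transmittance * opacity, as in standard volume rendering.
    """
    # Density grows as the ray nears the surface (small UDF).
    sigma = np.exp(-udf_vals / beta) / beta
    alpha = 1.0 - np.exp(-sigma * dt)  # per-sample opacity
    # Transmittance: probability the ray survives past each earlier sample.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    return trans * alpha

# A ray crossing an open surface at t = 0.5: the UDF dips to 0 and rises
# again with no sign change, so SDF-style in/out weighting cannot apply.
t = np.linspace(0.0, 1.0, 101)
udf = np.abs(t - 0.5)
w = udf_render_weights(udf)
print(int(np.argmax(w)))  # → 50, the sample nearest the surface
```

An SDF-based renderer such as NeuS derives opacity from the field's sign change along the ray; here the weights are driven by the distance magnitude alone, which is what makes open surfaces representable.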
Related papers
- NeuRodin: A Two-stage Framework for High-Fidelity Neural Surface Reconstruction [63.85586195085141]
Signed Distance Function (SDF)-based volume rendering has demonstrated significant capabilities in surface reconstruction.
We introduce NeuRodin, a novel two-stage neural surface reconstruction framework.
NeuRodin achieves high-fidelity surface reconstruction and retains the flexible optimization characteristics of density-based methods.
arXiv Detail & Related papers (2024-08-19T17:36:35Z)
- NeAT: Learning Neural Implicit Surfaces with Arbitrary Topologies from Multi-view Images [17.637064969966847]
NeAT is a new neural rendering framework that learns implicit surfaces with arbitrary topologies from multi-view images.
NeAT supports easy field-to-mesh conversion using the classic Marching Cubes algorithm.
Our approach is able to faithfully reconstruct both watertight and non-watertight surfaces.
arXiv Detail & Related papers (2023-03-21T16:49:41Z)
- NeuralUDF: Learning Unsigned Distance Fields for Multi-view Reconstruction of Surfaces with Arbitrary Topologies [87.06532943371575]
We present a novel method, called NeuralUDF, for reconstructing surfaces with arbitrary topologies from 2D images via volume rendering.
In this paper, we propose to represent surfaces as the Unsigned Distance Function (UDF) and develop a new volume rendering scheme to learn the neural UDF representation.
arXiv Detail & Related papers (2022-11-25T15:21:45Z)
- Recovering Fine Details for Neural Implicit Surface Reconstruction [3.9702081347126943]
We present D-NeuS, a volume-rendering-based neural implicit surface reconstruction method capable of recovering fine geometric details.
We impose multi-view feature consistency on the surface points, derived by interpolating SDF zero-crossings from sampled points along rays.
Our method reconstructs high-accuracy surfaces with details, and outperforms the state of the art.
arXiv Detail & Related papers (2022-11-21T10:06:09Z)
- RangeUDF: Semantic Surface Reconstruction from 3D Point Clouds [106.54285912111888]
We present RangeUDF, a new implicit representation based framework to recover the geometry and semantics of continuous 3D scene surfaces from point clouds.
We show that RangeUDF clearly surpasses state-of-the-art approaches for surface reconstruction on four point cloud datasets.
arXiv Detail & Related papers (2022-04-19T21:39:45Z)
- Learning Anchored Unsigned Distance Functions with Gradient Direction Alignment for Single-view Garment Reconstruction [92.23666036481399]
We propose a novel learnable Anchored Unsigned Distance Function (AnchorUDF) representation for 3D garment reconstruction from a single image.
AnchorUDF represents 3D shapes by predicting unsigned distance fields (UDFs) to enable open garment surface modeling at arbitrary resolution.
arXiv Detail & Related papers (2021-08-19T03:45:38Z)
- NeuS: Learning Neural Implicit Surfaces by Volume Rendering for Multi-view Reconstruction [88.02850205432763]
We present a novel neural surface reconstruction method, called NeuS, for reconstructing objects and scenes with high fidelity from 2D image inputs.
Existing neural surface reconstruction approaches, such as DVR and IDR, require foreground masks as supervision.
We observe that the conventional volume rendering method causes inherent geometric errors for surface reconstruction.
We propose a new formulation that is free of bias in the first order of approximation, thus leading to more accurate surface reconstruction even without the mask supervision.
arXiv Detail & Related papers (2021-06-20T12:59:42Z)
- Neural Unsigned Distance Fields for Implicit Function Learning [53.241423815726925]
We propose Neural Distance Fields (NDF), a neural network based model which predicts the unsigned distance field for arbitrary 3D shapes.
NDFs represent surfaces at high resolution like prior implicit models, but do not require closed surface data.
NDF can be used for multi-target regression (multiple outputs for one input) with techniques that have been exclusively used for rendering in graphics.
arXiv Detail & Related papers (2020-10-26T22:49:45Z)
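The NDF idea of predicting unsigned distances for arbitrary, including open, shapes can be illustrated with a toy closed-form UDF; the function below is a hypothetical example (not the NDF network), showing that an open surface has a well-defined unsigned distance everywhere but no inside/outside, so a signed distance cannot represent it:

```python
import numpy as np

def udf_open_segment(p, a, b):
    """Unsigned distance from points p (shape (N, 2)) to the open segment a-b.

    Hypothetical illustration: the distance is symmetric about the
    segment and never negative, unlike a signed distance field.
    """
    ab = b - a
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = np.clip(((p - a) @ ab) / (ab @ ab), 0.0, 1.0)
    closest = a + np.outer(t, ab)
    return np.linalg.norm(p - closest, axis=1)

a, b = np.array([0.0, 0.0]), np.array([1.0, 0.0])
pts = np.array([[0.5, 0.3], [0.5, -0.3], [2.0, 0.0]])
# Points on opposite sides of the segment get the same distance (0.3, 0.3),
# and the endpoint handles the open boundary (distance 1.0).
print(udf_open_segment(pts, a, b))
```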
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.