RangeUDF: Semantic Surface Reconstruction from 3D Point Clouds
- URL: http://arxiv.org/abs/2204.09138v1
- Date: Tue, 19 Apr 2022 21:39:45 GMT
- Title: RangeUDF: Semantic Surface Reconstruction from 3D Point Clouds
- Authors: Bing Wang, Zhengdi Yu, Bo Yang, Jie Qin, Toby Breckon, Ling Shao, Niki Trigoni, Andrew Markham
- Abstract summary: We present RangeUDF, a new implicit-representation-based framework to recover the geometry and semantics of continuous 3D scene surfaces from point clouds.
We show that RangeUDF clearly surpasses state-of-the-art approaches for surface reconstruction on four point cloud datasets.
- Score: 106.54285912111888
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present RangeUDF, a new implicit-representation-based framework
to recover the geometry and semantics of continuous 3D scene surfaces from
point clouds. Unlike occupancy fields or signed distance fields, which can only
model closed 3D surfaces, our approach is not restricted to any type of
topology. Unlike existing unsigned distance fields, our framework does not
suffer from surface ambiguity. In addition, RangeUDF can jointly estimate
precise semantics for continuous surfaces. The key to our approach is a
range-aware unsigned distance function together with a surface-oriented
semantic segmentation module. Extensive experiments show that RangeUDF clearly
surpasses state-of-the-art approaches for surface reconstruction on four point
cloud datasets. Moreover, RangeUDF demonstrates superior generalization across
multiple unseen datasets, which no existing approach achieves.
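The unsigned distance fields central to RangeUDF and the papers listed below can be illustrated with a minimal numerical sketch (not the authors' implementation): the ground-truth UDF value of a query point is simply its distance to the nearest sample of the surface, and because no inside/outside sign is required, open, non-watertight surfaces pose no problem. The half-cylinder point cloud below is hypothetical.

```python
import numpy as np

def udf(queries: np.ndarray, cloud: np.ndarray) -> np.ndarray:
    """Unsigned distance from each query to its nearest cloud point.

    Unlike an SDF, the result is always >= 0, so no closed
    (watertight) surface is needed to define it.
    """
    # Pairwise distances: (Q, 1, 3) - (1, P, 3) -> (Q, P)
    diff = queries[:, None, :] - cloud[None, :, :]
    return np.linalg.norm(diff, axis=-1).min(axis=-1)

# Hypothetical open surface: points sampled on a half-cylinder shell.
theta = np.linspace(0.0, np.pi, 200)   # only half the circle -> open surface
z = np.linspace(-1.0, 1.0, 50)
t, zz = np.meshgrid(theta, z)
cloud = np.stack([np.cos(t).ravel(), np.sin(t).ravel(), zz.ravel()], axis=-1)

q = np.array([[0.0, 0.0, 0.0]])        # axis of the cylinder
print(udf(q, cloud))                   # ~1.0: distance to the shell
```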
Related papers
- Gradient Distance Function [52.615859148238464]
We show that Gradient Distance Functions (GDFs) can be differentiable at the surface while still being able to represent open surfaces.
This is done by associating to each 3D point a 3D vector whose norm is taken to be the unsigned distance to the surface.
We demonstrate the effectiveness of GDFs on ShapeNet Car, Multi-Garment, and 3D-Scene datasets.
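The idea in the GDF summary above can be sketched with a brute-force analogue (not the paper's learned network): instead of predicting the scalar distance to the surface, predict the 3D vector from the query to its closest surface point. The vector's norm is the unsigned distance, while the vector itself varies smoothly through zero at the surface, avoiding the gradient discontinuity of the scalar UDF there. The flat open patch below is a hypothetical point cloud.

```python
import numpy as np

def gdf(queries: np.ndarray, cloud: np.ndarray) -> np.ndarray:
    """For each query, the 3D vector to its nearest cloud point.

    The norm of this vector is the unsigned distance; the vector
    itself passes smoothly through zero at the surface.
    """
    diff = cloud[None, :, :] - queries[:, None, :]    # (Q, P, 3)
    dist = np.linalg.norm(diff, axis=-1)              # (Q, P)
    nearest = dist.argmin(axis=-1)                    # (Q,)
    return diff[np.arange(len(queries)), nearest]     # (Q, 3)

# Hypothetical flat open patch in the z = 0 plane.
xs = np.linspace(-1.0, 1.0, 51)
gx, gy = np.meshgrid(xs, xs)
cloud = np.stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)], axis=-1)

q = np.array([[0.0, 0.0, 0.3], [0.0, 0.0, -0.3]])
v = gdf(q, cloud)
print(np.linalg.norm(v, axis=-1))  # both ~0.3 (the unsigned distance)
print(v[:, 2])                     # -0.3 and +0.3: the vector flips across the patch
```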
arXiv Detail & Related papers (2024-10-29T18:04:01Z)
- Unsigned Orthogonal Distance Fields: An Accurate Neural Implicit Representation for Diverse 3D Shapes [29.65562721329593]
In this paper, we introduce unsigned orthogonal distance fields (UODFs), a novel neural implicit representation based on unsigned distance fields (UDFs).
In UODFs, the minimal unsigned distance from any spatial point to the shape surface is defined along a single direction, in contrast to the multi-directional determination made by SDFs and UDFs.
We verify the effectiveness of UODFs through a range of reconstruction examples, ranging from watertight and non-watertight shapes to complex shapes.
arXiv Detail & Related papers (2024-03-03T06:58:35Z) - DUDF: Differentiable Unsigned Distance Fields with Hyperbolic Scaling [0.20287200280084108]
We learn a hyperbolic scaling of the unsigned distance field, which defines a new Eikonal problem with distinct boundary conditions.
Our approach not only addresses the challenge of open surface representation but also demonstrates significant improvement in reconstruction quality and training performance.
arXiv Detail & Related papers (2024-02-14T00:42:19Z) - NeUDF: Leaning Neural Unsigned Distance Fields with Volume Rendering [25.078149064632218]
NeUDF can reconstruct surfaces with arbitrary topologies solely from multi-view supervision.
We extensively evaluate our method over a number of challenging datasets, including DTU, MGN, and Deep Fashion 3D.
arXiv Detail & Related papers (2023-04-20T04:14:42Z) - GeoUDF: Surface Reconstruction from 3D Point Clouds via Geometry-guided
Distance Representation [73.77505964222632]
We present a learning-based method, namely GeoUDF, to tackle the problem of reconstructing a discrete surface from a sparse point cloud.
To be specific, we propose a geometry-guided learning method for UDF and its gradient estimation.
To extract triangle meshes from the predicted UDF, we propose a customized edge-based marching cube module.
arXiv Detail & Related papers (2022-11-30T06:02:01Z) - NeuralUDF: Learning Unsigned Distance Fields for Multi-view
Reconstruction of Surfaces with Arbitrary Topologies [87.06532943371575]
We present a novel method, called NeuralUDF, for reconstructing surfaces with arbitrary topologies from 2D images via volume rendering.
In this paper, we propose to represent surfaces as the Unsigned Distance Function (UDF) and develop a new volume rendering scheme to learn the neural UDF representation.
arXiv Detail & Related papers (2022-11-25T15:21:45Z) - CAP-UDF: Learning Unsigned Distance Functions Progressively from Raw Point Clouds with Consistency-Aware Field Optimization [54.69408516025872]
CAP-UDF is a novel method to learn consistency-aware UDF from raw point clouds.
We train a neural network to gradually infer the relationship between queries and the approximated surface.
We also introduce a polygonization algorithm to extract surfaces using the gradients of the learned UDF.
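The gradient-based surface extraction that CAP-UDF's polygonization relies on can be sketched with the classic projection step p = q - UDF(q) * grad UDF(q), which moves a query point along the negative gradient onto the zero-level set. The sketch below is an illustration, not the paper's method: it uses a brute-force nearest-neighbour UDF on a hypothetical sphere point cloud, where this gradient is available in closed form.

```python
import numpy as np

def project_to_surface(queries: np.ndarray, cloud: np.ndarray) -> np.ndarray:
    """Move queries onto the surface via p = q - UDF(q) * grad UDF(q).

    For a nearest-neighbour UDF, the gradient at q is the unit vector
    pointing away from the closest surface point s, so this projection
    lands exactly on s.
    """
    diff = queries[:, None, :] - cloud[None, :, :]           # (Q, P, 3)
    dist = np.linalg.norm(diff, axis=-1)                     # (Q, P)
    idx = dist.argmin(axis=-1)
    d = dist[np.arange(len(queries)), idx]                   # UDF(q)
    grad = diff[np.arange(len(queries)), idx] / d[:, None]   # grad UDF(q)
    return queries - d[:, None] * grad

# Hypothetical unit sphere sampled as a point cloud.
rng = np.random.default_rng(0)
pts = rng.normal(size=(5000, 3))
cloud = pts / np.linalg.norm(pts, axis=-1, keepdims=True)

q = rng.normal(size=(10, 3)) * 2.0       # random off-surface queries
p = project_to_surface(q, cloud)
print(np.abs(np.linalg.norm(p, axis=-1) - 1.0).max())  # ~0: on the sphere
```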
arXiv Detail & Related papers (2022-10-06T08:51:08Z)
- Learning Anchored Unsigned Distance Functions with Gradient Direction Alignment for Single-view Garment Reconstruction [92.23666036481399]
We propose a novel learnable Anchored Unsigned Distance Function (AnchorUDF) representation for 3D garment reconstruction from a single image.
AnchorUDF represents 3D shapes by predicting unsigned distance fields (UDFs) to enable open garment surface modeling at arbitrary resolution.
arXiv Detail & Related papers (2021-08-19T03:45:38Z)
- MeshSDF: Differentiable Iso-Surface Extraction [45.769838982991736]
We introduce a differentiable way to produce explicit surface mesh representations from Deep Signed Distance Functions.
Our key insight is that by reasoning on how implicit field perturbations impact local surface geometry, one can ultimately differentiate the 3D location of surface samples.
We exploit this to define MeshSDF, an end-to-end differentiable mesh representation which can vary its topology.
arXiv Detail & Related papers (2020-06-06T23:44:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information listed here and is not responsible for any consequences of its use.