NeuralODF: Learning Omnidirectional Distance Fields for 3D Shape
Representation
- URL: http://arxiv.org/abs/2206.05837v1
- Date: Sun, 12 Jun 2022 20:59:26 GMT
- Title: NeuralODF: Learning Omnidirectional Distance Fields for 3D Shape
Representation
- Authors: Trevor Houchens, Cheng-You Lu, Shivam Duggal, Rao Fu, Srinath Sridhar
- Abstract summary: In visual computing, 3D geometry is represented in many different forms including meshes, point clouds, voxel grids, level sets, and depth images.
We propose Omnidirectional Distance Fields (ODFs), a new 3D shape representation that encodes geometry by storing the depth to the object's surface from any 3D position in any viewing direction.
- Score: 7.208066405543874
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In visual computing, 3D geometry is represented in many different forms
including meshes, point clouds, voxel grids, level sets, and depth images. Each
representation is suited for different tasks thus making the transformation of
one representation into another (forward map) an important and common problem.
We propose Omnidirectional Distance Fields (ODFs), a new 3D shape
representation that encodes geometry by storing the depth to the object's
surface from any 3D position in any viewing direction. Since rays are the
fundamental unit of an ODF, it can be used to easily transform to and from
common 3D representations like meshes or point clouds. Different from level set
methods that are limited to representing closed surfaces, ODFs are unsigned and
can thus model open surfaces (e.g., garments). We demonstrate that ODFs can be
effectively learned with a neural network (NeuralODF) despite the inherent
discontinuities at occlusion boundaries. We also introduce efficient forward
mapping algorithms for transforming ODFs to and from common 3D representations.
Specifically, we introduce an efficient Jumping Cubes algorithm for generating
meshes from ODFs. Experiments demonstrate that NeuralODF can learn to capture
high-quality shape by overfitting to a single object, and also learn to
generalize on common shape categories.
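Below is a minimal, non-authoritative sketch of the ray-based view described in the abstract: an analytic ODF for a sphere (standing in for a learned NeuralODF) and the ODF-to-point-cloud forward map, where each hit ray contributes the surface point origin + depth * direction. The shape and all names are illustrative assumptions, not the paper's code.

    # Analytic ODF of a sphere: depth to the first surface hit along each ray.
    import numpy as np

    def sphere_odf(origins, dirs, radius=1.0):
        """Return (depth, hit) for rays origin + t * dir against a sphere at the origin."""
        dirs = dirs / np.linalg.norm(dirs, axis=-1, keepdims=True)
        b = np.sum(origins * dirs, axis=-1)                   # o . v
        c = np.sum(origins * origins, axis=-1) - radius ** 2  # |o|^2 - r^2
        disc = b * b - c
        hit = disc >= 0.0
        sqrt_disc = np.sqrt(np.clip(disc, 0.0, None))
        t0, t1 = -b - sqrt_disc, -b + sqrt_disc
        depth = np.where(t0 >= 0.0, t0, t1)                   # nearest non-negative root
        hit &= depth >= 0.0
        return depth, hit

    # ODF -> point cloud: shoot random rays and keep the intersections.
    rng = np.random.default_rng(0)
    origins = rng.uniform(-2.0, 2.0, size=(10000, 3))
    dirs = rng.normal(size=(10000, 3))
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    depth, hit = sphere_odf(origins, dirs)
    points = origins[hit] + depth[hit][:, None] * dirs[hit]   # samples on the surface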
Related papers
- Gradient Distance Function [52.615859148238464]
We show that Gradient Distance Functions (GDFs) can be differentiable at the surface while still being able to represent open surfaces.
This is done by associating to each 3D point a 3D vector whose norm is taken to be the unsigned distance to the surface.
We demonstrate the effectiveness of GDFs on ShapeNet Car, Multi-Garment, and 3D-Scene datasets.
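A minimal sketch of the supervision target implied by this description, not the authors' code: the GDF value at a query point can be taken as the vector to its nearest surface sample, so that its norm is the unsigned distance (the surface sampling and names below are assumptions).

    import numpy as np
    from scipy.spatial import cKDTree

    def gdf_targets(queries, surface_points):
        """Vector from each query point to its nearest surface sample; its norm is the unsigned distance."""
        tree = cKDTree(surface_points)
        dist, idx = tree.query(queries)           # unsigned distances and nearest-sample indices
        vectors = surface_points[idx] - queries   # 3D vector field supervised by a GDF
        assert np.allclose(np.linalg.norm(vectors, axis=-1), dist)
        return vectors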
arXiv Detail & Related papers (2024-10-29T18:04:01Z)
- Probabilistic Directed Distance Fields for Ray-Based Shape Representations [8.134429779950658]
Directed Distance Fields (DDFs) are a novel neural shape representation that builds upon classical distance fields.
We show how to model inherent discontinuities in the underlying field.
We then apply DDFs to several applications, including single-shape fitting, generative modelling, and single-image 3D reconstruction.
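As a hedged illustration of a ray-based query in this spirit (the architecture and names are illustrative, not the paper's): a small network can map an oriented point, i.e. a position plus a unit viewing direction, to a depth together with a visibility probability that absorbs the discontinuities mentioned above.

    import torch
    import torch.nn as nn

    class TinyDirectedField(nn.Module):
        """Maps (position, direction) to a non-negative depth and a hit probability."""
        def __init__(self, hidden=128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(6, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 2),                     # (raw depth, visibility logit)
            )

        def forward(self, positions, directions):
            out = self.net(torch.cat([positions, directions], dim=-1))
            depth = nn.functional.softplus(out[..., 0])   # non-negative depth along the ray
            visible = torch.sigmoid(out[..., 1])          # probability the ray hits the surface
            return depth, visible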
arXiv Detail & Related papers (2024-04-13T21:02:49Z)
- UDiFF: Generating Conditional Unsigned Distance Fields with Optimal Wavelet Diffusion [51.31220416754788]
We present UDiFF, a 3D diffusion model for unsigned distance fields (UDFs) which is capable of generating textured 3D shapes with open surfaces from text conditions or unconditionally.
Our key idea is to generate UDFs in spatial-frequency domain with an optimal wavelet transformation, which produces a compact representation space for UDF generation.
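As a rough, non-authoritative illustration of the spatial-frequency idea (the toy shape, wavelet, and decomposition level are my choices, not the UDiFF pipeline): a dense UDF grid can be reduced to the coarse block of a 3D wavelet decomposition, which gives a much smaller tensor for a generative model to operate on.

    import numpy as np
    import pywt  # PyWavelets

    # Toy UDF: unsigned distance to a unit sphere sampled on a 64^3 lattice.
    xs = np.linspace(-1.5, 1.5, 64)
    grid = np.stack(np.meshgrid(xs, xs, xs, indexing="ij"), axis=-1)
    udf = np.abs(np.linalg.norm(grid, axis=-1) - 1.0)

    coeffs = pywt.wavedecn(udf, wavelet="bior4.4", level=2)
    coarse = coeffs[0]                       # low-frequency block: a compact stand-in for the full grid
    print(udf.shape, "->", coarse.shape)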
arXiv Detail & Related papers (2024-04-10T09:24:54Z)
- Unsigned Orthogonal Distance Fields: An Accurate Neural Implicit Representation for Diverse 3D Shapes [29.65562721329593]
In this paper, we introduce a novel neural implicit representation based on unsigned orthogonal distance fields (UODFs).
In UODFs, the minimal unsigned distance from any spatial point to the shape surface is defined solely in one direction, contrasting with the multi-directional determination made by SDF and UDF.
We verify the effectiveness of UODFs through a range of reconstruction examples, extending from watertight or non-watertight shapes to complex shapes.
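One hedged reading of the single-direction idea, on a toy analytic shape (the function and shape are illustrative, not the paper's): along one fixed axis, store the unsigned distance from a point to the nearest surface crossing of that axis-aligned line.

    import numpy as np

    def uodf_along_x(points, radius=1.0):
        """Unsigned distance to a unit sphere measured only along the +/- x direction."""
        y2z2 = points[:, 1] ** 2 + points[:, 2] ** 2
        hit = y2z2 <= radius ** 2                            # the x-aligned line meets the sphere
        x_surf = np.sqrt(np.clip(radius ** 2 - y2z2, 0.0, None))
        # nearest of the two crossings at +x_surf and -x_surf, measured along x only
        dist = np.minimum(np.abs(points[:, 0] - x_surf), np.abs(points[:, 0] + x_surf))
        return np.where(hit, dist, np.inf), hit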
arXiv Detail & Related papers (2024-03-03T06:58:35Z)
- FIRe: Fast Inverse Rendering using Directional and Signed Distance Functions [97.5540646069663]
We introduce a novel neural scene representation that we call the directional distance function (DDF).
Our DDF is defined on the unit sphere and predicts the distance to the surface along any given direction.
Based on our DDF, we present a novel fast algorithm (FIRe) to reconstruct 3D shapes given a posed depth map.
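A hedged sketch of why a directional field suits depth-based fitting (the names and the plain L2 loss are placeholders, not the FIRe algorithm): per-ray predicted distances can be compared directly against an observed posed depth map, one query per pixel.

    import torch

    def depth_fitting_loss(ddf, ray_origins, ray_dirs, observed_depth, valid_mask):
        """L2 error between predicted and observed per-ray depths."""
        predicted = ddf(ray_origins, ray_dirs)                # one network query per pixel ray
        return torch.mean((predicted[valid_mask] - observed_depth[valid_mask]) ** 2)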
arXiv Detail & Related papers (2022-03-30T13:24:04Z)
- High-fidelity 3D Model Compression based on Key Spheres [6.59007277780362]
We propose an SDF prediction network using explicit key spheres as input.
Our method achieves high-fidelity, high-compression 3D object coding and reconstruction.
arXiv Detail & Related papers (2022-01-19T09:21:54Z)
- Deep Marching Tetrahedra: a Hybrid Representation for High-Resolution 3D Shape Synthesis [90.26556260531707]
DMTet is a conditional generative model that can synthesize high-resolution 3D shapes using simple user guides such as coarse voxels.
Unlike deep 3D generative models that directly generate explicit representations such as meshes, our model can synthesize shapes with arbitrary topology.
arXiv Detail & Related papers (2021-11-08T05:29:35Z)
- Neural Geometric Level of Detail: Real-time Rendering with Implicit 3D Shapes [77.6741486264257]
We introduce an efficient neural representation that, for the first time, enables real-time rendering of high-fidelity neural SDFs.
We show that our representation is 2-3 orders of magnitude more efficient in terms of rendering speed compared to previous works.
arXiv Detail & Related papers (2021-01-26T18:50:22Z)
- DUDE: Deep Unsigned Distance Embeddings for Hi-Fidelity Representation of Complex 3D Surfaces [8.104199886760275]
DUDE is a disentangled shape representation that utilizes an unsigned distance field (uDF) to represent proximity to a surface, and a normal vector field (nVF) to represent surface orientation.
We show that a combination of these two (uDF+nVF) can be used to learn high fidelity representations for arbitrary open/closed shapes.
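A minimal sketch of the two supervision signals named here, assuming a surface sampled into points with per-point normals (the data pipeline and names are mine, not the paper's): the uDF target is the distance to the nearest surface sample and the nVF target is that sample's normal.

    import numpy as np
    from scipy.spatial import cKDTree

    def dude_targets(queries, surface_points, surface_normals):
        """Per-query (unsigned distance, surface normal) supervision pairs."""
        tree = cKDTree(surface_points)
        dist, idx = tree.query(queries)      # uDF target: proximity to the surface
        normals = surface_normals[idx]       # nVF target: orientation of the nearest surface sample
        return dist, normals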
arXiv Detail & Related papers (2020-11-04T22:49:05Z)
- Neural Unsigned Distance Fields for Implicit Function Learning [53.241423815726925]
We propose Neural Distance Fields (NDF), a neural network based model which predicts the unsigned distance field for arbitrary 3D shapes.
NDF represent surfaces at high resolutions as prior implicit models, but do not require closed surface data.
NDF can be used for multi-target regression (multiple outputs for one input) with techniques that have been exclusively used for rendering in graphics.
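As a hedged sketch of how surface geometry can be read back out of a learned unsigned distance field of this kind (the loop and names are illustrative; udf_net stands for any differentiable model mapping (N, 3) points to (N,) distances): query points can be marched along the negative gradient by the predicted distance until they land on the surface.

    import torch

    def project_to_surface(udf_net, points, steps=5, eps=1e-9):
        """Move points onto the surface: p <- p - f(p) * grad f(p) / |grad f(p)|."""
        points = points.clone().requires_grad_(True)
        for _ in range(steps):
            dist = udf_net(points)
            grad, = torch.autograd.grad(dist.sum(), points)
            step = dist.unsqueeze(-1) * grad / (grad.norm(dim=-1, keepdim=True) + eps)
            points = (points - step).detach().requires_grad_(True)
        return points.detach()

Because the field is unsigned, the same projection applies to the open surfaces discussed in the NeuralODF abstract above.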
arXiv Detail & Related papers (2020-10-26T22:49:45Z)