Developability Approximation for Neural Implicits through Rank
Minimization
- URL: http://arxiv.org/abs/2308.03900v3
- Date: Thu, 2 Nov 2023 07:04:17 GMT
- Title: Developability Approximation for Neural Implicits through Rank
Minimization
- Authors: Pratheba Selvaraju
- Abstract summary: This paper introduces a method for reconstructing an approximate developable surface from a neural implicit surface.
The central idea of our method involves incorporating a regularization term that operates on the second-order derivatives of the neural implicits.
We draw inspiration from the properties of surface curvature and employ rank minimization techniques derived from compressed sensing.
- Score: 0.5439020425819
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Developability refers to the ability to form a surface from a
two-dimensional plane without any tearing or shearing. It finds practical
applications in the fabrication industry. An essential characteristic of a
developable 3D surface is its zero Gaussian curvature, which means that either
one or both of the principal curvatures are zero. This paper introduces a
method for reconstructing an approximate developable surface from a neural
implicit surface. The central idea of our method involves incorporating a
regularization term that operates on the second-order derivatives of the neural
implicits, effectively promoting zero Gaussian curvature. Implicit surfaces
offer the advantage of smoother deformation with infinite resolution,
overcoming the high polygonal constraints of state-of-the-art methods using
discrete representations. We draw inspiration from the properties of surface
curvature and employ rank minimization techniques derived from compressed
sensing. Experimental results on both developable and non-developable surfaces,
including those affected by noise, validate the generalizability of our method.
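Since this page gives only a high-level description of the regularizer, the following is a minimal PyTorch sketch of the general idea rather than the paper's implementation: differentiate a neural implicit twice, restrict the Hessian to the tangent plane of the level set, and penalize a rank surrogate so that at most one principal curvature remains non-zero (pushing the Gaussian curvature toward zero). The small MLP, the random query points, the weight lambda_dev, and the choice of summing all singular values except the largest are illustrative assumptions; the paper's actual rank-minimization relaxation from compressed sensing may differ.

```python
# A minimal sketch (not the authors' code) of a Hessian-based developability
# regularizer for a neural implicit f: R^3 -> R.
import torch

def implicit_mlp(dim_hidden=64):
    # Small SDF-style coordinate network with smooth activations.
    return torch.nn.Sequential(
        torch.nn.Linear(3, dim_hidden), torch.nn.Softplus(beta=100),
        torch.nn.Linear(dim_hidden, dim_hidden), torch.nn.Softplus(beta=100),
        torch.nn.Linear(dim_hidden, 1),
    )

def developability_penalty(f, x):
    """Penalize the rank of the Hessian restricted to the tangent plane.

    Zero Gaussian curvature means the tangential Hessian has at most one
    non-zero singular value, so we sum every singular value except the
    largest (one simple rank-<=1 surrogate; the paper's relaxation may differ).
    """
    x = x.requires_grad_(True)
    y = f(x).sum()
    grad = torch.autograd.grad(y, x, create_graph=True)[0]            # (N, 3)
    hess_rows = [torch.autograd.grad(grad[:, i].sum(), x, create_graph=True)[0]
                 for i in range(3)]
    H = torch.stack(hess_rows, dim=1)                                 # (N, 3, 3)
    n = torch.nn.functional.normalize(grad, dim=-1)                   # unit normals
    P = torch.eye(3, device=x.device) - n[:, :, None] * n[:, None, :]  # tangent projector
    H_tan = P @ H @ P                                                 # tangential Hessian
    s = torch.linalg.svdvals(H_tan)                                   # descending order
    return s[:, 1:].sum(dim=-1).mean()

# Usage: add the penalty to whatever surface-fitting loss is already in use.
f = implicit_mlp()
pts = torch.rand(256, 3) * 2 - 1        # random query points in [-1, 1]^3
lambda_dev = 0.1                        # assumed weight, needs tuning
loss = lambda_dev * developability_penalty(f, pts)
loss.backward()
```

In a full pipeline, this term would be combined with the surface-fitting and eikonal losses the implicit is already trained with.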
Related papers
- ND-SDF: Learning Normal Deflection Fields for High-Fidelity Indoor Reconstruction [50.07671826433922]
It is non-trivial to simultaneously recover meticulous geometry and preserve smoothness across regions with differing characteristics.
We propose ND-SDF, which learns a Normal Deflection field to represent the angular deviation between the scene normal and the prior normal.
Our method not only obtains smooth weakly textured regions such as walls and floors but also preserves the geometric details of complex structures.
arXiv Detail & Related papers (2024-08-22T17:59:01Z)
- Neural-Singular-Hessian: Implicit Neural Representation of Unoriented Point Clouds by Enforcing Singular Hessian [44.28251558359345]
We propose a new approach for reconstructing surfaces from point clouds.
Our technique aligns the gradients of a near-surface point and its on-surface projection, producing a rough but faithful shape within just a few iterations (an illustrative sketch of such a Hessian penalty appears after this list).
arXiv Detail & Related papers (2023-09-04T20:10:38Z)
- Edge Preserving Implicit Surface Representation of Point Clouds [27.632399836710164]
We propose a novel edge-preserving implicit surface reconstruction method, which mainly consists of a differentiable Laplacian regularizer and a dynamic edge sampling strategy.
Compared with the state-of-the-art methods, experimental results show that our method significantly improves the quality of 3D reconstruction results.
arXiv Detail & Related papers (2023-01-12T08:04:47Z)
- Neural Volumetric Mesh Generator [40.224769507878904]
We propose Neural Volumetric Mesh Generator (NVMG), which can generate novel and high-quality volumetric meshes.
Our pipeline can generate high-quality artifact-free volumetric and surface meshes from random noise or a reference image without any post-processing.
arXiv Detail & Related papers (2022-10-06T18:46:51Z)
- Minimal Neural Atlas: Parameterizing Complex Surfaces with Minimal Charts and Distortion [71.52576837870166]
We present Minimal Neural Atlas, a novel atlas-based explicit neural surface representation.
At its core is a fully learnable parametric domain, given by an implicit probabilistic occupancy field defined on an open square of the parametric space.
Our reconstructions are more accurate in terms of the overall geometry, due to the separation of concerns on topology and geometry.
arXiv Detail & Related papers (2022-07-29T16:55:06Z)
- Edge-preserving Near-light Photometric Stereo with Neural Surfaces [76.50065919656575]
We introduce an analytically differentiable neural surface in near-light photometric stereo for avoiding differentiation errors at sharp depth edges.
Experiments on both synthetic and real-world scenes demonstrate the effectiveness of our method for detailed shape recovery with edge preservation.
arXiv Detail & Related papers (2022-07-11T04:51:43Z)
- A Level Set Theory for Neural Implicit Evolution under Explicit Flows [102.18622466770114]
Coordinate-based neural networks parameterizing implicit surfaces have emerged as efficient representations of geometry.
We present a framework that allows applying deformation operations defined for triangle meshes onto such implicit surfaces.
We show that our approach exhibits improvements for applications like surface smoothing, mean-curvature flow, inverse rendering and user-defined editing on implicit geometry.
arXiv Detail & Related papers (2022-04-14T17:59:39Z)
- Learning Modified Indicator Functions for Surface Reconstruction [10.413340575612233]
We propose a learning-based approach for implicit surface reconstruction from raw point clouds without normals.
Our method is inspired by the Gauss Lemma in potential energy theory, which gives an explicit integral formula for the indicator functions.
We design a novel deep neural network to perform surface integral and learn the modified indicator functions from un-oriented and noisy point clouds.
arXiv Detail & Related papers (2021-11-18T05:30:35Z)
- Differentiable Surface Rendering via Non-Differentiable Sampling [19.606523934811577]
We present a method for differentiable rendering of 3D surfaces that supports both explicit and implicit representations.
We show for the first time efficient, differentiable rendering of an isosurface extracted from a neural radiance field (NeRF), and demonstrate surface-based, rather than volume-based, rendering of a NeRF.
arXiv Detail & Related papers (2021-08-10T19:25:06Z)
- Pure Exploration in Kernel and Neural Bandits [90.23165420559664]
We study pure exploration in bandits, where the dimension of the feature representation can be much larger than the number of arms.
To overcome the curse of dimensionality, we propose to adaptively embed the feature representation of each arm into a lower-dimensional space.
arXiv Detail & Related papers (2021-06-22T19:51:59Z)
- Neural Splines: Fitting 3D Surfaces with Infinitely-Wide Neural Networks [61.07202852469595]
We present Neural Splines, a technique for 3D surface reconstruction that is based on random feature kernels arising from infinitely-wide shallow ReLU networks.
Our method achieves state-of-the-art results, outperforming recent neural network-based techniques and widely used Poisson Surface Reconstruction.
arXiv Detail & Related papers (2020-06-24T14:54:59Z)
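For comparison with the rank-based regularizer above, here is a similarly hedged sketch of the singular-Hessian idea summarized in the Neural-Singular-Hessian entry: sample points near the surface and push the determinant of the implicit's Hessian toward zero there, which tends to align the gradient at a near-surface point with the gradient at its on-surface projection. The network, the perturbation scale sigma, and the loss weight are assumptions for illustration, not that paper's training setup.

```python
# A minimal sketch (not that paper's code) of a singular-Hessian penalty
# for a neural implicit f: R^3 -> R.
import torch

def hessian(f, x):
    # Batched 3x3 Hessian of the scalar field f at points x of shape (N, 3).
    x = x.requires_grad_(True)
    grad = torch.autograd.grad(f(x).sum(), x, create_graph=True)[0]
    rows = [torch.autograd.grad(grad[:, i].sum(), x, create_graph=True)[0]
            for i in range(3)]
    return torch.stack(rows, dim=1)

def singular_hessian_penalty(f, surface_pts, sigma=0.01):
    # Perturb (approximate) surface samples to get near-surface queries,
    # then push det(H) toward zero so the Hessian becomes singular there.
    near = surface_pts + sigma * torch.randn_like(surface_pts)
    H = hessian(f, near)
    return torch.linalg.det(H).abs().mean()

# Usage alongside an ordinary point-cloud fitting loss (weight 0.1 is assumed):
f = torch.nn.Sequential(
    torch.nn.Linear(3, 64), torch.nn.Softplus(beta=100),
    torch.nn.Linear(64, 64), torch.nn.Softplus(beta=100),
    torch.nn.Linear(64, 1),
)
pts = torch.rand(128, 3) * 2 - 1   # stand-in for samples of the input point cloud
loss = 0.1 * singular_hessian_penalty(f, pts)
loss.backward()
```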