A Level Set Theory for Neural Implicit Evolution under Explicit Flows
- URL: http://arxiv.org/abs/2204.07159v1
- Date: Thu, 14 Apr 2022 17:59:39 GMT
- Title: A Level Set Theory for Neural Implicit Evolution under Explicit Flows
- Authors: Ishit Mehta, Manmohan Chandraker, Ravi Ramamoorthi
- Abstract summary: Coordinate-based neural networks parameterizing implicit surfaces have emerged as efficient representations of geometry.
We present a framework that allows applying deformation operations defined for triangle meshes onto such implicit surfaces.
We show that our approach exhibits improvements for applications like surface smoothing, mean-curvature flow, inverse rendering and user-defined editing on implicit geometry.
- Score: 102.18622466770114
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Coordinate-based neural networks parameterizing implicit surfaces have
emerged as efficient representations of geometry. They effectively act as
parametric level sets with the zero-level set defining the surface of interest.
We present a framework that allows applying deformation operations defined for
triangle meshes onto such implicit surfaces. Several of these operations can be
viewed as energy-minimization problems that induce an instantaneous flow field
on the explicit surface. Our method uses the flow field to deform parametric
implicit surfaces by extending the classical theory of level sets. We also
derive a consolidated view for existing methods on differentiable surface
extraction and rendering, by formalizing connections to the level-set theory.
We show that these methods drift from the theory and that our approach exhibits
improvements for applications like surface smoothing, mean-curvature flow,
inverse rendering and user-defined editing on implicit geometry.
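The abstract casts mesh-style deformation as an explicit flow field that evolves a parametric level set. As a point of reference, the sketch below applies the classical level-set equation, ∂φ/∂t = -V·∇φ, to a coordinate-network implicit by advecting sampled values for one time step and then refitting the network to the advected values. This is a minimal illustration under assumed names (SDFNet, velocity, evolve_one_step), not the authors' implementation.

```python
# Minimal sketch: one explicit-flow step applied to a parametric level set.
# Advect phi via d(phi)/dt = -V . grad(phi) at sample points, then refit the
# network so its zero level set tracks the flow. All names are illustrative.
import torch


class SDFNet(torch.nn.Module):
    """Tiny coordinate MLP standing in for a neural implicit surface."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(3, hidden), torch.nn.Softplus(beta=100),
            torch.nn.Linear(hidden, hidden), torch.nn.Softplus(beta=100),
            torch.nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)


def velocity(x):
    # Placeholder explicit flow field, e.g. a constant translation along +x.
    v = torch.zeros_like(x)
    v[:, 0] = 1.0
    return v


def evolve_one_step(f, x, dt=1e-2, iters=100, lr=1e-4):
    """Advect f along `velocity` for one time step dt, then refit its parameters."""
    x = x.clone().requires_grad_(True)
    phi = f(x)
    (grad_phi,) = torch.autograd.grad(phi.sum(), x)
    # Advected target values: phi_new = phi - dt * V . grad(phi)
    target = (phi - dt * (velocity(x) * grad_phi).sum(-1)).detach()
    x = x.detach()
    opt = torch.optim.Adam(f.parameters(), lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        loss = torch.nn.functional.mse_loss(f(x), target)
        loss.backward()
        opt.step()
    return f


if __name__ == "__main__":
    f = SDFNet()
    # In practice samples would concentrate near the zero level set.
    samples = torch.rand(4096, 3) * 2.0 - 1.0
    evolve_one_step(f, samples)
```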
Related papers
- Geometry Distributions [51.4061133324376]
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
arXiv Detail & Related papers (2024-11-25T04:06:48Z)
- Flatten Anything: Unsupervised Neural Surface Parameterization [76.4422287292541]
We introduce the Flatten Anything Model (FAM), an unsupervised neural architecture to achieve global free-boundary surface parameterization.
Compared with previous methods, our FAM directly operates on discrete surface points without utilizing connectivity information.
Our FAM is fully automated, requires no pre-cutting, and can handle highly complex topologies.
arXiv Detail & Related papers (2024-05-23T14:39:52Z)
- A Theory of Topological Derivatives for Inverse Rendering of Geometry [87.49881303178061]
We introduce a theoretical framework for differentiable surface evolution that allows discrete topology changes through the use of topological derivatives.
We validate the proposed theory with optimization of closed curves in 2D and surfaces in 3D to lend insights into limitations of current methods.
arXiv Detail & Related papers (2023-08-19T00:55:55Z)
- Hybrid-CSR: Coupling Explicit and Implicit Shape Representation for Cortical Surface Reconstruction [28.31844964164312]
Hybrid-CSR is a geometric deep-learning model that combines explicit and implicit shape representations for cortical surface reconstruction.
Our method unifies explicit (oriented point clouds) and implicit (indicator function) cortical surface reconstruction.
arXiv Detail & Related papers (2023-07-23T11:32:14Z)
- Minimal Neural Atlas: Parameterizing Complex Surfaces with Minimal Charts and Distortion [71.52576837870166]
We present Minimal Neural Atlas, a novel atlas-based explicit neural surface representation.
At its core is a fully learnable parametric domain, given by an implicit probabilistic occupancy field defined on an open square of the parametric space.
Our reconstructions are more accurate in terms of the overall geometry, due to the separation of concerns on topology and geometry.
arXiv Detail & Related papers (2022-07-29T16:55:06Z)
- A shallow physics-informed neural network for solving partial differential equations on surfaces [0.0]
We introduce a mesh-free physics-informed neural network for solving partial differential equations on surfaces.
With the aid of the level-set function, surface geometric quantities such as the normal and mean curvature can be computed directly and used in the surface differential expressions (see the sketch after this list).
With just a few hundred trainable parameters, our network model is able to achieve high predictive accuracy.
arXiv Detail & Related papers (2022-03-03T09:18:21Z)
- DeepCurrents: Learning Implicit Representations of Shapes with Boundaries [25.317812435426216]
We propose a hybrid shape representation that combines explicit boundary curves with implicit learned interiors.
We further demonstrate learning families of shapes jointly parameterized by boundary curves and latent codes.
arXiv Detail & Related papers (2021-11-17T20:34:20Z)
- Deep Implicit Surface Point Prediction Networks [49.286550880464866]
Deep neural representations of 3D shapes as implicit functions have been shown to produce high fidelity models.
This paper presents a novel approach that models such surfaces using a new class of implicit representations called the closest surface-point (CSP) representation.
arXiv Detail & Related papers (2021-06-10T14:31:54Z)
- Topology-Adaptive Mesh Deformation for Surface Evolution, Morphing, and Multi-View Reconstruction [35.01330182954581]
We introduce a new self-intersection removal algorithm, TransforMesh, and we propose a mesh evolution framework based on this algorithm.
We describe two challenging applications, namely surface morphing and 3-D reconstruction.
arXiv Detail & Related papers (2020-12-10T09:26:40Z)
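The shallow physics-informed network entry above notes that normals and mean curvature can be computed directly from the level-set function. The standard identities are n = ∇φ/|∇φ| for the unit normal and div(n) for the curvature (the sum of the principal curvatures; some conventions divide by 2). The sketch below evaluates them with automatic differentiation; the sphere-SDF choice of φ and the function names are illustrative assumptions, not that paper's code.

```python
# Minimal sketch of the level-set identities: unit normal n = grad(phi)/|grad(phi)|
# and curvature div(n). phi is an analytic sphere SDF so results can be checked by hand.
import torch


def phi(x):
    # Signed distance to the unit sphere; any differentiable level-set function works.
    return x.norm(dim=-1) - 1.0


def normal_and_curvature(x):
    x = x.clone().requires_grad_(True)
    (g,) = torch.autograd.grad(phi(x).sum(), x, create_graph=True)
    n = g / g.norm(dim=-1, keepdim=True)  # unit normal
    # Divergence of the unit normal = sum of principal curvatures.
    H = sum(
        torch.autograd.grad(n[:, i].sum(), x, retain_graph=True)[0][:, i]
        for i in range(x.shape[-1])
    )
    return n.detach(), H.detach()


if __name__ == "__main__":
    pts = torch.nn.functional.normalize(torch.randn(8, 3), dim=-1)  # points on the sphere
    n, H = normal_and_curvature(pts)
    print(H)  # div(n) = 2 everywhere on the unit sphere
```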