Minimal Neural Atlas: Parameterizing Complex Surfaces with Minimal Charts and Distortion
- URL: http://arxiv.org/abs/2207.14782v1
- Date: Fri, 29 Jul 2022 16:55:06 GMT
- Title: Minimal Neural Atlas: Parameterizing Complex Surfaces with Minimal Charts and Distortion
- Authors: Weng Fei Low, Gim Hee Lee
- Abstract summary: We present Minimal Neural Atlas, a novel atlas-based explicit neural surface representation.
At its core is a fully learnable parametric domain, given by an implicit probabilistic occupancy field defined on an open square of the parametric space.
Our reconstructions are more accurate in terms of the overall geometry, due to the separation of concerns on topology and geometry.
- Score: 71.52576837870166
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Explicit neural surface representations allow for exact and efficient
extraction of the encoded surface at arbitrary precision, as well as analytic
derivation of differential geometric properties such as surface normal and
curvature. Such desirable properties, which are absent in their implicit
counterparts, make them ideal for various applications in computer vision,
graphics, and robotics. However, state-of-the-art works are limited in the
topology they can effectively describe, the distortion they introduce when
reconstructing complex surfaces, and their model efficiency. In this work, we present Minimal Neural Atlas, a
novel atlas-based explicit neural surface representation. At its core is a
fully learnable parametric domain, given by an implicit probabilistic occupancy
field defined on an open square of the parametric space. In contrast, prior
works generally predefine the parametric domain. The added flexibility enables
charts to admit arbitrary topology and boundary. Thus, our representation can
learn a minimal atlas of 3 charts with distortion-minimal parameterization for
surfaces of arbitrary topology, including closed and open surfaces with
arbitrary connected components. Our experiments support the hypotheses and show
that our reconstructions are more accurate in terms of the overall geometry,
due to the separation of concerns on topology and geometry.
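The core construction above can be sketched in a few lines: each chart maps an open square into 3D, and a learned occupancy field over the same square decides which UV samples belong to the chart's parametric domain. The sketch below is purely illustrative and not the authors' code; both the chart and the occupancy field are stand-in closed-form functions here, whereas in the paper they are neural networks.

```python
import numpy as np

def chart(uv):
    """Toy chart: embed the unit square as a curved patch in R^3."""
    u, v = uv[:, 0], uv[:, 1]
    return np.stack([u, v, 0.25 * np.sin(2 * np.pi * u)], axis=1)

def occupancy(uv):
    """Toy probabilistic occupancy field over the open square.

    A disk-shaped domain stands in for the learned implicit field that,
    in the paper, lets each chart admit arbitrary topology and boundary.
    """
    d = np.linalg.norm(uv - 0.5, axis=1)
    return 1.0 / (1.0 + np.exp(20.0 * (d - 0.4)))  # sigmoid falloff

rng = np.random.default_rng(0)
uv = rng.uniform(0.0, 1.0, size=(1024, 2))  # samples on the open square
keep = occupancy(uv) > 0.5                  # threshold the occupancy field
points = chart(uv[keep])                    # surface points of this chart

print(points.shape)  # (n_kept, 3), with n_kept < 1024
```

Because only the occupied subset of the square is mapped to the surface, the chart's domain is free to take any shape, which is what allows a small, fixed number of charts to cover surfaces of arbitrary topology.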
Related papers
- Geometry Distributions [51.4061133324376]
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
arXiv Detail & Related papers (2024-11-25T04:06:48Z)
- Neural Geometry Processing via Spherical Neural Surfaces [29.30952578277242]
We show how to compute core geometric operators directly on a neural surface representation.
These operators, in turn, enable us to create geometry processing tools that act directly on the neural representations.
We demonstrate illustrative applications in (neural) spectral analysis, heat flow and mean curvature flow.
arXiv Detail & Related papers (2024-07-10T15:28:02Z)
- Flatten Anything: Unsupervised Neural Surface Parameterization [76.4422287292541]
We introduce the Flatten Anything Model (FAM), an unsupervised neural architecture to achieve global free-boundary surface parameterization.
Compared with previous methods, our FAM directly operates on discrete surface points without utilizing connectivity information.
Our FAM is fully automated, requires no pre-cutting, and can handle highly complex topologies.
arXiv Detail & Related papers (2024-05-23T14:39:52Z)
- A Level Set Theory for Neural Implicit Evolution under Explicit Flows [102.18622466770114]
Coordinate-based neural networks parameterizing implicit surfaces have emerged as efficient representations of geometry.
We present a framework that allows applying deformation operations defined for triangle meshes onto such implicit surfaces.
We show that our approach exhibits improvements for applications like surface smoothing, mean-curvature flow, inverse rendering and user-defined editing on implicit geometry.
arXiv Detail & Related papers (2022-04-14T17:59:39Z)
- Neural Convolutional Surfaces [59.172308741945336]
This work is concerned with a representation of shapes that disentangles fine, local, and possibly repeating geometry from global, coarse structures.
We show that this approach achieves better neural shape compression than the state of the art, as well as enabling manipulation and transfer of shape details.
arXiv Detail & Related papers (2022-04-05T15:40:11Z)
- A shallow physics-informed neural network for solving partial differential equations on surfaces [0.0]
We introduce a mesh-free physics-informed neural network for solving partial differential equations on surfaces.
With the aid of a level set function, geometric quantities of the surface, such as its normal and mean curvature, can be computed directly and used in our surface differential expressions.
With just a few hundred trainable parameters, our network model is able to achieve high predictive accuracy.
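The level-set quantities this summary refers to have a standard form: with a level set function phi whose zero level set is the surface, the unit normal is grad(phi)/|grad(phi)| and the divergence of that normal gives the (total) mean curvature. The sketch below, which is not the paper's network, checks this numerically for the signed distance to the unit sphere, where the divergence of the normal should come out near 2/r = 2 (twice the mean curvature under the averaged convention).

```python
import numpy as np

def phi(p):
    """Signed distance to the unit sphere: its zero level set is the surface."""
    return np.linalg.norm(p) - 1.0

def grad(f, p, h=1e-5):
    """Central-difference gradient of a scalar function at point p."""
    g = np.zeros(3)
    for i in range(3):
        e = np.zeros(3); e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

def normal(p):
    """Unit surface normal: grad(phi) normalized."""
    g = grad(phi, p)
    return g / np.linalg.norm(g)

def mean_curvature(p, h=1e-4):
    """Divergence of the unit normal field, by central differences."""
    div = 0.0
    for i in range(3):
        e = np.zeros(3); e[i] = h
        div += (normal(p + e)[i] - normal(p - e)[i]) / (2 * h)
    return div

p = np.array([1.0, 0.0, 0.0])  # a point on the unit sphere
print(normal(p))               # ~ [1, 0, 0]
print(mean_curvature(p))       # ~ 2, i.e. 2/r for a sphere of radius 1
```

In a physics-informed setting these quantities would be evaluated by automatic differentiation of the network rather than by finite differences; the finite differences here only serve to make the formulas concrete.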
arXiv Detail & Related papers (2022-03-03T09:18:21Z)
- Deep Networks on Toroids: Removing Symmetries Reveals the Structure of Flat Regions in the Landscape Geometry [3.712728573432119]
We develop a standardized parameterization in which all symmetries are removed, resulting in a toroidal topology.
We derive a meaningful notion of the flatness of minimizers and of the geodesic paths connecting them.
We also find that minimizers found by variants of gradient descent can be connected by zero-error paths with a single bend.
arXiv Detail & Related papers (2022-02-07T09:57:54Z)
- Differential Geometry in Neural Implicits [0.6198237241838558]
We introduce a neural implicit framework that bridges discrete differential geometry of triangle meshes and continuous differential geometry of neural implicit surfaces.
It exploits the differentiable properties of neural networks and the discrete geometry of triangle meshes to approximate them as the zero-level sets of neural implicit functions.
arXiv Detail & Related papers (2022-01-23T13:40:45Z)
- Deep Implicit Surface Point Prediction Networks [49.286550880464866]
Deep neural representations of 3D shapes as implicit functions have been shown to produce high fidelity models.
This paper presents a novel approach that models such surfaces using a new class of implicit representations called the closest surface-point (CSP) representation.
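The closest-surface-point idea can be made concrete with a shape that has a closed form: for a sphere of radius r centred at the origin, the closest point to any query p is r * p / |p|. The sketch below is a hypothetical illustration of the representation's interface, not the paper's model; for arbitrary shapes this mapping would be predicted by a network.

```python
import numpy as np

def csp_sphere(p, r=1.0):
    """Closest point on the sphere of radius r to query point p (p != 0).

    A CSP-style representation returns a surface point for every query,
    rather than a signed distance, so the unsigned distance and the
    direction to the surface both fall out of the same prediction.
    """
    return r * p / np.linalg.norm(p)

q = np.array([3.0, 4.0, 0.0])  # query point at distance 5 from the origin
s = csp_sphere(q)              # its closest point on the unit sphere
print(s)                       # [0.6, 0.8, 0.0]
print(np.linalg.norm(q - s))   # 4.0, the unsigned distance to the surface
```

Note that the unsigned distance recovered this way is well defined even for open or non-watertight surfaces, which is one motivation for this class of representations.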
arXiv Detail & Related papers (2021-06-10T14:31:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.