Neural Geometry Processing via Spherical Neural Surfaces
- URL: http://arxiv.org/abs/2407.07755v2
- Date: Mon, 23 Dec 2024 23:56:06 GMT
- Title: Neural Geometry Processing via Spherical Neural Surfaces
- Authors: Romy Williamson, Niloy J. Mitra
- Abstract summary: We propose a neural surface representation for genus-0 surfaces and demonstrate how to compute geometric core operators directly on this representation.
Our representation is fully seamless, overcoming a key limitation of similar explicit representations such as Neural Surface Maps.
We demonstrate illustrative applications in (neural) spectral analysis, heat flow and mean curvature flow, and evaluate robustness to shape variations.
- Score: 29.30952578277242
- Abstract: Neural surfaces (e.g., neural map encoding, deep implicits and neural radiance fields) have recently gained popularity because of their generic structure (e.g., multi-layer perceptron) and easy integration with modern learning-based setups. Traditionally, we have a rich toolbox of geometry processing algorithms designed for polygonal meshes to analyze and operate on surface geometry. In the absence of an analogous toolbox, neural representations are typically discretized and converted into a mesh, before applying any geometry processing algorithm. This is unsatisfactory and, as we demonstrate, unnecessary. In this work, we propose a spherical neural surface representation for genus-0 surfaces and demonstrate how to compute core geometric operators directly on this representation. Namely, we estimate surface normals and first and second fundamental forms of the surface, as well as compute surface gradient, surface divergence and Laplace-Beltrami operator on scalar/vector fields defined on the surface. Our representation is fully seamless, overcoming a key limitation of similar explicit representations such as Neural Surface Maps [Morreale et al. 2021]. These operators, in turn, enable geometry processing directly on the neural representations without any unnecessary meshing. We demonstrate illustrative applications in (neural) spectral analysis, heat flow and mean curvature flow, and evaluate robustness to isometric shape variations. We propose theoretical formulations and validate their numerical estimates, against analytical estimates, mesh-based baselines, and neural alternatives, where available. By systematically linking neural surface representations with classical geometry processing algorithms, we believe that this work can become a key ingredient in enabling neural geometry processing. Code will be released upon acceptance, accessible from the project webpage.
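The abstract describes the construction in prose only. As a rough illustration of the underlying idea, a genus-0 surface realized as an MLP from the unit sphere to R^3, with differential quantities obtained by automatic differentiation, the sketch below estimates the unit normal and first fundamental form at sampled sphere points. The network architecture, activation choice, and spherical parameterization are assumptions made for illustration; this is not the authors' released implementation.

```python
# Minimal sketch (not the authors' code): a genus-0 surface as an MLP
# S^2 -> R^3, with first-fundamental-form quantities via autodiff.
import torch

class SphericalNeuralSurface(torch.nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(3, hidden), torch.nn.Softplus(),
            torch.nn.Linear(hidden, hidden), torch.nn.Softplus(),
            torch.nn.Linear(hidden, 3),
        )

    def forward(self, p):          # p: (N, 3) points on the unit sphere
        return self.net(p)         # (N, 3) points on the surface

def first_fundamental_form(surface, theta, phi):
    """E, F, G and the unit normal at spherical coordinates (theta, phi)."""
    theta = theta.clone().requires_grad_(True)
    phi = phi.clone().requires_grad_(True)
    p = torch.stack([torch.sin(theta) * torch.cos(phi),
                     torch.sin(theta) * torch.sin(phi),
                     torch.cos(theta)], dim=-1)
    x = surface(p)                                   # (N, 3) embedded points
    # Partial derivatives of the embedding w.r.t. the sphere parameters.
    f_theta = torch.stack([torch.autograd.grad(x[:, i].sum(), theta,
                                               create_graph=True)[0]
                           for i in range(3)], dim=-1)
    f_phi = torch.stack([torch.autograd.grad(x[:, i].sum(), phi,
                                             create_graph=True)[0]
                         for i in range(3)], dim=-1)
    E = (f_theta * f_theta).sum(-1)                  # metric coefficients
    F = (f_theta * f_phi).sum(-1)
    G = (f_phi * f_phi).sum(-1)
    n = torch.cross(f_theta, f_phi, dim=-1)          # surface normal
    n = n / n.norm(dim=-1, keepdim=True)
    return E, F, G, n

surface = SphericalNeuralSurface()
theta = torch.rand(8) * torch.pi                     # away from the poles in general
phi = torch.rand(8) * 2 * torch.pi
E, F, G, n = first_fundamental_form(surface, theta, phi)
```

Second derivatives of the same map give the second fundamental form, from which mean curvature (and hence mean curvature flow) follows; the Laplace-Beltrami operator on fields over the surface is likewise determined by the metric, in the spirit of the operators listed in the abstract.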
Related papers
- Geometry Distributions [51.4061133324376]
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
arXiv Detail & Related papers (2024-11-25T04:06:48Z)
- Parameterization-driven Neural Surface Reconstruction for Object-oriented Editing in Neural Rendering [35.69582529609475]
This paper introduces a novel neural algorithm for parameterizing neural implicit surfaces to simple parametric domains like spheres and polycubes.
It computes a bi-directional deformation between the object and the domain, using a forward mapping from the object's zero level set and an inverse deformation for backward mapping.
We demonstrate the method's effectiveness on images of human heads and man-made objects.
arXiv Detail & Related papers (2023-10-09T08:42:40Z)
- Neural networks learn to magnify areas near decision boundaries [32.84188052937496]
We study how training shapes the geometry induced by unconstrained neural network feature maps.
We first show that at infinite width, neural networks with random parameters induce highly symmetric metrics on input space.
This symmetry is broken by feature learning: networks trained to perform classification tasks learn to magnify local areas along decision boundaries.
arXiv Detail & Related papers (2023-01-26T19:43:16Z)
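For the "Neural networks learn to magnify areas near decision boundaries" entry above, the magnification is a statement about the metric a feature map pulls back from feature space. A generic way to probe it, not the paper's experimental setup (the toy two-hidden-layer network and 2-D input are assumptions), is to form G(x) = J(x)^T J(x) from the Jacobian of the feature map and evaluate the local area scaling sqrt(det G):

```python
# Sketch: local area magnification of a feature map via the pulled-back
# metric G(x) = J(x)^T J(x), where J is the Jacobian of the feature map.
import torch

feature_map = torch.nn.Sequential(     # assumed toy network with 2-D input
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 32),
)

def area_magnification(x):
    """sqrt(det(J^T J)) at a single 2-D input x: how much the feature
    map stretches an infinitesimal area element around x."""
    J = torch.autograd.functional.jacobian(feature_map, x)  # (32, 2)
    G = J.T @ J                                             # induced metric
    return torch.sqrt(torch.det(G))

x = torch.tensor([0.3, -0.7])
print(area_magnification(x))
```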
- NeuralMeshing: Differentiable Meshing of Implicit Neural Representations [63.18340058854517]
We propose a novel differentiable meshing algorithm for extracting surface meshes from neural implicit representations.
Our method produces meshes with regular tessellation patterns and fewer triangle faces compared to existing methods.
arXiv Detail & Related papers (2022-10-05T16:52:25Z)
- Minimal Neural Atlas: Parameterizing Complex Surfaces with Minimal Charts and Distortion [71.52576837870166]
We present Minimal Neural Atlas, a novel atlas-based explicit neural surface representation.
At its core is a fully learnable parametric domain, given by an implicit probabilistic occupancy field defined on an open square of the parametric space.
Our reconstructions are more accurate in terms of the overall geometry, due to the separation of concerns on topology and geometry.
arXiv Detail & Related papers (2022-07-29T16:55:06Z)
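The "fully learnable parametric domain" of the Minimal Neural Atlas entry above can be pictured with a minimal two-network sketch: one network maps UV points of the open unit square to 3-D, and a second network predicts which UV points are occupied, so the effective chart is the occupied subset. The architectures and the 0.5 threshold below are assumptions for illustration, not the paper's model:

```python
# Sketch of a learnable parametric domain: one chart maps UV points to 3-D,
# and an occupancy network decides which UV points belong to the chart.
import torch

chart = torch.nn.Sequential(          # (u, v) in (0, 1)^2 -> R^3
    torch.nn.Linear(2, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 3),
)
occupancy = torch.nn.Sequential(      # (u, v) -> probability in (0, 1)
    torch.nn.Linear(2, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 1), torch.nn.Sigmoid(),
)

uv = torch.rand(4096, 2)              # sample the open unit square
points = chart(uv)                    # candidate surface points
keep = occupancy(uv).squeeze(-1) > 0.5
surface_samples = points[keep]        # only occupied UV points survive
```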
- Neural Convolutional Surfaces [59.172308741945336]
This work is concerned with a representation of shapes that disentangles fine, local and possibly repeating geometry, from global, coarse structures.
We show that this approach achieves better neural shape compression than the state of the art, as well as enabling manipulation and transfer of shape details.
arXiv Detail & Related papers (2022-04-05T15:40:11Z)
- Spelunking the Deep: Guaranteed Queries for General Neural Implicit Surfaces [35.438964954948574]
This work presents a new approach to perform queries directly on general neural implicit functions for a wide range of existing architectures.
Our key tool is the application of range analysis to neural networks, using automatic arithmetic rules to bound the output of a network over a region.
We use the resulting bounds to develop queries including ray casting, intersection testing, spatial hierarchy construction, fast mesh extraction, and closest-point evaluation.
arXiv Detail & Related papers (2022-02-05T00:37:08Z)
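The range analysis described in the Spelunking the Deep entry above can be illustrated with its simplest instance: interval arithmetic pushed through a ReLU MLP layer by layer. The paper develops more general and tighter rules, so the sketch below (random weights, fixed two-layer architecture, plain interval bounds) is only a hedged illustration of the principle:

```python
# Sketch: bound the outputs of a small ReLU MLP over an axis-aligned box
# of inputs using interval arithmetic (a simple instance of range analysis).
import numpy as np

def interval_affine(lo, hi, W, b):
    """Bounds of W @ x + b when each x[i] lies in [lo[i], hi[i]]."""
    center, radius = (lo + hi) / 2.0, (hi - lo) / 2.0
    mid = W @ center + b
    rad = np.abs(W) @ radius
    return mid - rad, mid + rad

def interval_relu(lo, hi):
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

def bound_mlp(weights, biases, lo, hi):
    """Conservative output bounds of an MLP with ReLU hidden layers."""
    for i, (W, b) in enumerate(zip(weights, biases)):
        lo, hi = interval_affine(lo, hi, W, b)
        if i < len(weights) - 1:            # no ReLU after the last layer
            lo, hi = interval_relu(lo, hi)
    return lo, hi

rng = np.random.default_rng(0)
weights = [rng.standard_normal((16, 3)), rng.standard_normal((1, 16))]
biases = [rng.standard_normal(16), rng.standard_normal(1)]
lo, hi = bound_mlp(weights, biases, np.full(3, -0.1), np.full(3, 0.1))
print(lo, hi)   # every input in the box maps to an output inside [lo, hi]
```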
- A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes [78.120734120667]
We build the preimage of a point in the output manifold in the input space.
We focus for simplicity on the case of neural network maps from n-dimensional real spaces to (n - 1)-dimensional real spaces.
arXiv Detail & Related papers (2021-12-17T11:47:45Z)
- Neural Surface Maps [38.172396047006266]
We advocate considering neural networks as encoding surface maps.
We show that they can readily be used to define surfaces via atlases, composed to obtain surface-to-surface mappings, and optimized against differentiable objectives defined on them, such as any notion of distortion.
arXiv Detail & Related papers (2021-03-31T09:48:26Z)
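The compose-and-optimize viewpoint of the Neural Surface Maps entry above can be sketched in a few lines: two neural maps are composed and a simple distortion proxy (here, deviation of the composed Jacobian from an isometry) is evaluated with automatic differentiation. The particular maps, domains, and energy below are illustrative assumptions, not the formulation of Morreale et al.:

```python
# Sketch: compose two neural maps f: R^2 -> R^2 and g: R^2 -> R^3 and
# measure an isometric-distortion proxy ||J^T J - I||_F of g(f(.)) via autodiff.
import torch

f = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(),
                        torch.nn.Linear(64, 2))       # map between 2-D domains
g = torch.nn.Sequential(torch.nn.Linear(2, 64), torch.nn.Tanh(),
                        torch.nn.Linear(64, 3))       # embedding into R^3

def distortion(uv):
    """Mean deviation from isometry of the composed map at points uv."""
    composed = lambda p: g(f(p))
    total = 0.0
    for p in uv:                                       # per-point Jacobian (3, 2)
        J = torch.autograd.functional.jacobian(composed, p, create_graph=True)
        JtJ = J.T @ J
        total = total + ((JtJ - torch.eye(2)) ** 2).sum()
    return total / uv.shape[0]

uv = torch.rand(32, 2)
loss = distortion(uv)          # differentiable w.r.t. the maps' weights
loss.backward()                # gradients flow into both f and g
```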
- Primal-Dual Mesh Convolutional Neural Networks [62.165239866312334]
We propose a primal-dual framework for triangle meshes, drawing on the graph-neural-network literature.
Our method takes features for both edges and faces of a 3D mesh as input and dynamically aggregates them.
We provide theoretical insights into our approach using tools from the mesh-simplification literature.
arXiv Detail & Related papers (2020-10-23T14:49:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.