Marching Neurons: Accurate Surface Extraction for Neural Implicit Shapes
- URL: http://arxiv.org/abs/2509.21007v1
- Date: Thu, 25 Sep 2025 11:06:42 GMT
- Title: Marching Neurons: Accurate Surface Extraction for Neural Implicit Shapes
- Authors: Christian Stippel, Felix Mujkanovic, Thomas Leimkühler, Pedro Hermosilla
- Abstract summary: We introduce a novel approach for analytically extracting surfaces from neural implicit functions. Our method operates in parallel and can navigate large neural architectures. The resulting meshes faithfully capture the full geometric information from the network without ad-hoc spatial discretization.
- Score: 14.372634421912094
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Accurate surface geometry representation is crucial in 3D visual computing. Explicit representations, such as polygonal meshes, and implicit representations, like signed distance functions, each have distinct advantages, making efficient conversions between them increasingly important. Conventional surface extraction methods for implicit representations, such as the widely used Marching Cubes algorithm, rely on spatial decomposition and sampling, leading to inaccuracies due to fixed and limited resolution. We introduce a novel approach for analytically extracting surfaces from neural implicit functions. Our method operates natively in parallel and can navigate large neural architectures. By leveraging the fact that each neuron partitions the domain, we develop a depth-first traversal strategy to efficiently track the encoded surface. The resulting meshes faithfully capture the full geometric information from the network without ad-hoc spatial discretization, achieving unprecedented accuracy across diverse shapes and network architectures while maintaining competitive speed.
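The abstract's key observation is that each ReLU neuron partitions the input domain, so the network is exactly affine within each activation region and its zero level set there is a plane that can be tracked analytically. As a minimal sketch of that idea (a toy NumPy MLP standing in for a neural SDF; the network, its sizes, and all names are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer ReLU MLP f: R^2 -> R, a stand-in for a neural implicit function.
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)

def f(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def local_affine(x):
    """Inside the linear region containing x, f is exactly affine: f(y) = a @ y + c."""
    pattern = (W1 @ x + b1 > 0).astype(float)  # activation pattern identifies the region
    D = np.diag(pattern)                       # ReLU restricted to this region is linear
    a = W2 @ D @ W1                            # effective linear part
    c = W2 @ D @ b1 + b2                       # effective bias
    return a, c

x = np.array([0.3, -0.2])
a, c = local_affine(x)
assert np.allclose(f(x), a @ x + c)  # the affine form reproduces the network exactly
# Within this region, the surface is the hyperplane a @ y + c = 0; intersecting it
# with the region boundary and stepping into adjacent regions is the kind of
# depth-first traversal the abstract describes, with no grid sampling involved.
```

This only demonstrates the region-wise affine structure that makes an analytical extraction possible; the paper's actual traversal and parallelization strategy is not reproduced here.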
Related papers
- Vertex Features for Neural Global Illumination [21.57826395764302]
We present neural features, a generalized formulation of learnable representation for neural rendering tasks involving explicit mesh surfaces. We validate our neural representation across diverse neural rendering tasks, with a specific emphasis on neural radiosity.
arXiv Detail & Related papers (2025-08-11T11:10:19Z)
- Optimizing 3D Geometry Reconstruction from Implicit Neural Representations [2.3940819037450987]
Implicit neural representations have emerged as a powerful tool in learning 3D geometry.
We present a novel approach that both reduces computational expenses and enhances the capture of fine details.
arXiv Detail & Related papers (2024-10-16T16:36:23Z)
- HYVE: Hybrid Vertex Encoder for Neural Distance Fields [9.40036617308303]
We present a neural-network architecture suitable for accurate encoding of 3D shapes in a single forward pass.
Our network is able to output valid signed distance fields without explicit prior knowledge of non-zero distance values or shape occupancy.
arXiv Detail & Related papers (2023-10-10T14:07:37Z)
- GraphCSPN: Geometry-Aware Depth Completion via Dynamic GCNs [49.55919802779889]
We propose a Graph Convolution based Spatial Propagation Network (GraphCSPN) as a general approach for depth completion.
In this work, we leverage convolutional neural networks as well as graph neural networks in a complementary way for geometric representation learning.
Our method achieves the state-of-the-art performance, especially when compared in the case of using only a few propagation steps.
arXiv Detail & Related papers (2022-10-19T17:56:03Z)
- Neural Convolutional Surfaces [59.172308741945336]
This work is concerned with a representation of shapes that disentangles fine, local and possibly repeating geometry, from global, coarse structures.
We show that this approach achieves better neural shape compression than the state of the art, as well as enabling manipulation and transfer of shape details.
arXiv Detail & Related papers (2022-04-05T15:40:11Z)
- Deep Implicit Surface Point Prediction Networks [49.286550880464866]
Deep neural representations of 3D shapes as implicit functions have been shown to produce high fidelity models.
This paper presents a novel approach that models such surfaces using a new class of implicit representations called the closest surface-point (CSP) representation.
arXiv Detail & Related papers (2021-06-10T14:31:54Z)
- Neural Geometric Level of Detail: Real-time Rendering with Implicit 3D Shapes [77.6741486264257]
We introduce an efficient neural representation that, for the first time, enables real-time rendering of high-fidelity neural SDFs.
We show that our representation is 2-3 orders of magnitude more efficient in terms of rendering speed compared to previous works.
arXiv Detail & Related papers (2021-01-26T18:50:22Z)
- Iso-Points: Optimizing Neural Implicit Surfaces with Hybrid Representations [21.64457003420851]
We develop a hybrid neural surface representation that allows us to impose geometry-aware sampling and regularization.
We demonstrate that our method can be adopted to improve techniques for reconstructing neural implicit surfaces from multi-view images or point clouds.
arXiv Detail & Related papers (2020-12-11T15:51:04Z)
- DiffusionNet: Discretization Agnostic Learning on Surfaces [48.658589779470454]
We introduce a new approach to deep learning on 3D surfaces, based on the insight that a simple diffusion layer is highly effective for spatial communication.
The resulting networks automatically generalize across different samplings and resolutions of a surface.
We focus primarily on triangle mesh surfaces, and demonstrate state-of-the-art results for a variety of tasks including surface classification, segmentation, and non-rigid correspondence.
arXiv Detail & Related papers (2020-12-01T23:24:22Z)
- Primal-Dual Mesh Convolutional Neural Networks [62.165239866312334]
We adapt a primal-dual framework from the graph-neural-network literature to triangle meshes.
Our method takes features for both edges and faces of a 3D mesh as input and dynamically aggregates them.
We provide theoretical insights into our approach using tools from the mesh-simplification literature.
arXiv Detail & Related papers (2020-10-23T14:49:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.