Reg-NF: Efficient Registration of Implicit Surfaces within Neural Fields
- URL: http://arxiv.org/abs/2402.09722v1
- Date: Thu, 15 Feb 2024 05:31:03 GMT
- Title: Reg-NF: Efficient Registration of Implicit Surfaces within Neural Fields
- Authors: Stephen Hausler, David Hall, Sutharsan Mahendren and Peyman Moghadam
- Abstract summary: We present Reg-NF, a neural fields-based registration that optimises for the relative 6-DoF transformation between two arbitrary neural fields.
Key components of Reg-NF include a bidirectional registration loss, multi-view surface sampling, and utilisation of volumetric signed distance functions.
- Score: 6.949522577812908
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural fields, coordinate-based neural networks, have recently gained
popularity for implicitly representing a scene. In contrast to classical
methods that are based on explicit representations such as point clouds, neural
fields provide a continuous scene representation able to represent 3D geometry
and appearance in a way which is compact and ideal for robotics applications.
However, few prior methods have investigated registering multiple neural
fields by directly utilising these continuous implicit representations. In this
paper, we present Reg-NF, a neural fields-based registration that optimises for
the relative 6-DoF transformation between two arbitrary neural fields, even if
those two fields have different scale factors. Key components of Reg-NF include
a bidirectional registration loss, multi-view surface sampling, and utilisation
of volumetric signed distance functions (SDFs). We showcase our approach on a
new neural field dataset for evaluating registration problems. We provide an
exhaustive set of experiments and ablation studies to identify the performance
of our approach, while also discussing limitations to provide future direction
to the research community on open challenges in utilizing neural fields in
unconstrained environments.
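As a rough illustration of the components named in the abstract (SDF representations, surface sampling, a bidirectional loss, and a 6-DoF-plus-scale transform), the sketch below optimises a similarity transform between two pre-trained SDF networks. It is a minimal sketch under stated assumptions, not the authors' Reg-NF implementation: `sdf_a`, `sdf_b`, the axis-angle parameterisation, the optimiser settings, and the assumption that surface points have already been sampled (e.g. by multi-view surface sampling) are all illustrative choices.

```python
# Minimal sketch only -- not the authors' Reg-NF implementation.
# Assumes sdf_a and sdf_b are trained networks mapping (N, 3) points to signed distances.
import torch


def skew(v):
    """Skew-symmetric matrix of a 3-vector (kept differentiable via torch.stack)."""
    zero = torch.zeros((), dtype=v.dtype, device=v.device)
    return torch.stack([
        torch.stack([zero, -v[2], v[1]]),
        torch.stack([v[2], zero, -v[0]]),
        torch.stack([-v[1], v[0], zero]),
    ])


def rotation_from_axis_angle(rotvec):
    """Rodrigues' formula; the epsilon keeps the norm well-defined near zero."""
    theta = rotvec.norm() + 1e-8
    K = skew(rotvec / theta)
    eye = torch.eye(3, dtype=rotvec.dtype, device=rotvec.device)
    return eye + torch.sin(theta) * K + (1.0 - torch.cos(theta)) * (K @ K)


def register(sdf_a, sdf_b, surf_pts_a, surf_pts_b, iters=500, lr=1e-2):
    """Optimise a similarity transform x_b = s * R @ x_a + t between two SDF fields.

    surf_pts_a / surf_pts_b: (N, 3) points sampled near each field's surface,
    e.g. obtained beforehand by rendering depth from several viewpoints.
    """
    rotvec = (1e-3 * torch.randn(3)).requires_grad_()  # small init avoids the zero-norm corner case
    trans = torch.zeros(3, requires_grad=True)
    log_s = torch.zeros(1, requires_grad=True)
    opt = torch.optim.Adam([rotvec, trans, log_s], lr=lr)

    for _ in range(iters):
        opt.zero_grad()
        R = rotation_from_axis_angle(rotvec)
        s = log_s.exp()
        # Forward term: A's surface points mapped into B's frame should sit on B's zero level set.
        loss_ab = sdf_b(s * surf_pts_a @ R.T + trans).abs().mean()
        # Backward term: B's surface points mapped by the inverse transform should sit on A's.
        loss_ba = sdf_a(((surf_pts_b - trans) @ R) / s).abs().mean()
        (loss_ab + loss_ba).backward()   # bidirectional registration loss
        opt.step()

    return rotation_from_axis_angle(rotvec).detach(), trans.detach(), log_s.exp().detach()
```

In this sketch the two residual terms act as a bidirectional objective: optimising only one direction can let the transform settle on a partial overlap, whereas penalising both directions keeps the estimate consistent when the two fields have different extents or scale factors.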
Related papers
- Deep Learning on Object-centric 3D Neural Fields [19.781070751341154]
We introduce nf2vec, a framework capable of generating a compact latent representation for an input NF in a single inference pass.
We demonstrate that nf2vec effectively embeds 3D objects represented by the input NFs and showcase how the resulting embeddings can be employed in deep learning pipelines.
arXiv Detail & Related papers (2023-12-20T18:56:45Z)
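As context for the "single inference pass" interface described above (neural-field weights in, compact embedding out), here is a toy sketch that flattens a field's parameters and maps them to a fixed-size code with a plain MLP. The encoder architecture, dimensions, and names below are assumptions for illustration only and do not reflect the actual nf2vec architecture.

```python
# Toy sketch of the weights-in, embedding-out interface -- not the nf2vec architecture.
import torch
import torch.nn as nn


def flatten_nf_params(nf: nn.Module) -> torch.Tensor:
    """Concatenate all parameters of a neural field into one flat vector."""
    return torch.cat([p.detach().reshape(-1) for p in nf.parameters()])


class ToyNFEncoder(nn.Module):
    """Maps a flattened NF weight vector to a compact latent code in a single pass."""

    def __init__(self, in_dim: int, embed_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 1024), nn.ReLU(),
            nn.Linear(1024, embed_dim),
        )

    def forward(self, flat_weights: torch.Tensor) -> torch.Tensor:
        return self.net(flat_weights)


# Usage with a small (hypothetical) SDF MLP as the input neural field.
sdf_mlp = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1))
flat = flatten_nf_params(sdf_mlp)
embedding = ToyNFEncoder(in_dim=flat.numel())(flat)   # shape: (256,)
```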
- Neural Processing of Tri-Plane Hybrid Neural Fields [20.78031512517053]
We show that the tri-plane discrete data structure encodes rich information, which can be effectively processed by standard deep-learning machinery.
While processing a field with the same reconstruction quality, we achieve task performance far superior to frameworks that process large representations.
arXiv Detail & Related papers (2023-10-02T12:27:22Z)
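For background on the tri-plane data structure referenced above, the sketch below shows a generic tri-plane feature lookup: a 3D point is projected onto three axis-aligned feature planes and the bilinearly interpolated features are concatenated. The plane resolution, channel count, and coordinate conventions are arbitrary choices, and this is not the paper's processing pipeline.

```python
# Generic tri-plane feature lookup (not the paper's full pipeline).
import torch
import torch.nn.functional as F


def triplane_features(points: torch.Tensor, planes: torch.Tensor) -> torch.Tensor:
    """Sample features for 3D points from three axis-aligned feature planes.

    points: (N, 3) coordinates in [-1, 1]^3
    planes: (3, C, H, W) feature planes for the XY, XZ and YZ projections
    returns: (N, 3*C) concatenated bilinear samples
    """
    # Project each point onto the three planes.
    coords = [points[:, [0, 1]], points[:, [0, 2]], points[:, [1, 2]]]
    feats = []
    for plane, uv in zip(planes, coords):
        grid = uv.view(1, -1, 1, 2)                      # (1, N, 1, 2) layout for grid_sample
        sampled = F.grid_sample(plane[None], grid,       # (1, C, N, 1)
                                mode="bilinear", align_corners=True)
        feats.append(sampled[0, :, :, 0].T)              # (N, C)
    return torch.cat(feats, dim=-1)


# Usage (hypothetical sizes): 32-channel planes at 128x128 resolution.
planes = torch.randn(3, 32, 128, 128)
pts = torch.rand(1024, 3) * 2 - 1
feats = triplane_features(pts, planes)   # (1024, 96)
```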
- Generalizable Neural Fields as Partially Observed Neural Processes [16.202109517569145]
We propose a new paradigm that views the large-scale training of neural representations as a part of a partially-observed neural process framework.
We demonstrate that this approach outperforms both state-of-the-art gradient-based meta-learning approaches and hypernetwork approaches.
arXiv Detail & Related papers (2023-09-13T01:22:16Z)
- ResFields: Residual Neural Fields for Spatiotemporal Signals [61.44420761752655]
ResFields is a novel class of networks specifically designed to effectively represent complex temporal signals.
We conduct a comprehensive analysis of the properties of ResFields and propose a matrix factorization technique to reduce the number of trainable parameters.
We demonstrate the practical utility of ResFields by showcasing its effectiveness in capturing dynamic 3D scenes from sparse RGBD cameras.
arXiv Detail & Related papers (2023-09-06T16:59:36Z)
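To make the "residual weights plus matrix factorization" description above concrete, here is a minimal sketch of a linear layer whose weight matrix receives a time-conditioned, low-rank residual. The per-frame coefficient table, the rank, and the layer sizes are illustrative assumptions rather than the paper's exact formulation.

```python
# Minimal sketch of a time-conditioned residual linear layer with a low-rank
# factorised residual (illustrative; not the paper's exact formulation).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualLinear(nn.Module):
    """y = (W + dW(t)) x + b, with dW(t) = sum_r coeff[t, r] * basis[r]."""

    def __init__(self, in_dim: int, out_dim: int, num_frames: int, rank: int = 8):
        super().__init__()
        self.base = nn.Linear(in_dim, out_dim)
        # Per-frame coefficients and shared low-rank basis matrices.
        self.coeff = nn.Parameter(torch.zeros(num_frames, rank))
        self.basis = nn.Parameter(torch.randn(rank, out_dim, in_dim) * 1e-2)

    def forward(self, x: torch.Tensor, frame: int) -> torch.Tensor:
        # Residual weight for this frame: (out_dim, in_dim)
        delta_w = torch.einsum("r,rij->ij", self.coeff[frame], self.basis)
        return F.linear(x, self.base.weight + delta_w, self.base.bias)


# Usage (hypothetical sizes): a layer shared across 100 frames.
layer = ResidualLinear(in_dim=64, out_dim=64, num_frames=100)
x = torch.randn(512, 64)
y = layer(x, frame=17)   # (512, 64)
```

The parameter saving in this sketch comes from the factorisation: storing a `num_frames x rank` coefficient table plus `rank` shared basis matrices is far cheaper than keeping a full weight matrix per frame.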
- Neural Vector Fields: Generalizing Distance Vector Fields by Codebooks and Zero-Curl Regularization [73.3605319281966]
We propose a novel 3D representation, Neural Vector Fields (NVF), which adopts an explicit learning process to manipulate meshes and an implicit unsigned distance function (UDF) representation to break the barriers in resolution and topology.
We evaluate both NVFs on four surface reconstruction scenarios: watertight vs. non-watertight shapes, category-agnostic vs. category-unseen reconstruction, category-specific reconstruction, and cross-domain reconstruction.
arXiv Detail & Related papers (2023-09-04T10:42:56Z)
- nerf2nerf: Pairwise Registration of Neural Radiance Fields [38.13011152344739]
We introduce a technique for pairwise registration of neural fields that extends classical optimization-based local registration.
We introduce the concept of a "surface field", a field distilled from a pre-trained NeRF model.
We evaluate the effectiveness of our technique by introducing a dataset of pre-trained NeRF scenes.
arXiv Detail & Related papers (2022-11-03T06:04:59Z)
- Zonotope Domains for Lagrangian Neural Network Verification [102.13346781220383]
We decompose the problem of verifying a deep neural network into the verification of many 2-layer neural networks.
Our technique yields bounds that improve upon both linear programming and Lagrangian-based verification techniques.
arXiv Detail & Related papers (2022-10-14T19:31:39Z)
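For context on the zonotope abstract domain named above, the sketch below propagates a zonotope through one affine layer and a ReLU using the standard single-new-error-term relaxation. It only illustrates the domain itself; it is not the paper's Lagrangian decomposition, and the network weights and input box are placeholders.

```python
# Generic zonotope propagation through affine + ReLU layers
# (standard relaxation; not the paper's Lagrangian decomposition technique).
import numpy as np


def affine(center, gens, W, b):
    """Exact affine transform of a zonotope {c + G^T eps : eps in [-1, 1]^k}."""
    return W @ center + b, gens @ W.T


def relu(center, gens):
    """Sound over-approximation of ReLU (one new error term per crossing neuron)."""
    radius = np.abs(gens).sum(axis=0)
    lower, upper = center - radius, center + radius
    new_center, new_gens, extra = center.copy(), gens.copy(), []
    for i in range(center.size):
        l, u = lower[i], upper[i]
        if u <= 0:                       # always inactive: output is exactly 0
            new_center[i] = 0.0
            new_gens[:, i] = 0.0
        elif l >= 0:                     # always active: identity
            continue
        else:                            # crossing: ReLU(x) lies in [lam*x, lam*x - lam*l]
            lam = u / (u - l)
            mu = -lam * l / 2.0
            new_center[i] = lam * center[i] + mu
            new_gens[:, i] *= lam
            e = np.zeros(center.size)
            e[i] = mu                    # fresh error term capturing the relaxation gap
            extra.append(e)
    if extra:
        new_gens = np.vstack([new_gens, np.stack(extra)])
    return new_center, new_gens


# Usage: bound one hidden layer over the input box [-0.1, 0.1]^4 (hypothetical weights).
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((8, 4)), rng.standard_normal(8)
c, G = np.zeros(4), 0.1 * np.eye(4)      # input zonotope: the box itself
c, G = relu(*affine(c, G, W1, b1))
print("output bounds:", c - np.abs(G).sum(0), c + np.abs(G).sum(0))
```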
- TT-NF: Tensor Train Neural Fields [88.49847274083365]
We introduce a novel low-rank representation termed Tensor Train Neural Fields (TT-NF) for learning fields on regular grids.
We analyze the effect of low-rank compression on the downstream task quality metrics.
arXiv Detail & Related papers (2022-09-30T15:17:39Z)
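To illustrate the tensor-train format behind TT-NF, the sketch below reconstructs individual entries of a regular 3D grid from a chain of TT cores. The grid size, ranks, and random cores are arbitrary; the actual TT-NF sampling and training pipeline is not shown.

```python
# Minimal tensor-train lookup for a field stored on a regular 3D grid
# (illustrative of the TT format only; not the TT-NF training pipeline).
import numpy as np


def tt_cores(shape, ranks, rng):
    """Random TT cores G_d of shape (r_{d-1}, n_d, r_d) with boundary ranks 1."""
    ranks = [1, *ranks, 1]
    return [rng.standard_normal((ranks[d], n, ranks[d + 1])) * 0.1
            for d, n in enumerate(shape)]


def tt_value(cores, idx):
    """Reconstruct one grid entry as the product of the selected core slices."""
    out = np.ones((1, 1))
    for core, i in zip(cores, idx):
        out = out @ core[:, i, :]          # (1, r_{d-1}) @ (r_{d-1}, r_d)
    return out[0, 0]


# Usage: a 64^3 grid compressed with TT ranks (8, 8).
rng = np.random.default_rng(0)
cores = tt_cores((64, 64, 64), ranks=(8, 8), rng=rng)
n_params = sum(c.size for c in cores)      # 1*64*8 + 8*64*8 + 8*64*1 = 5120 vs 262144 dense
print(n_params, tt_value(cores, (3, 17, 42)))
```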
- Neural Fields in Visual Computing and Beyond [54.950885364735804]
Recent advances in machine learning have created increasing interest in solving visual computing problems using coordinate-based neural networks.
Neural fields have seen successful application in the synthesis of 3D shapes and images, animation of human bodies, 3D reconstruction, and pose estimation.
This report provides context, mathematical grounding, and an extensive review of literature on neural fields.
arXiv Detail & Related papers (2021-11-22T18:57:51Z)
- Bounding The Number of Linear Regions in Local Area for Neural Networks with ReLU Activations [6.4817648240626005]
We present the first method to estimate the upper bound of the number of linear regions in any sphere in the input space of a given ReLU neural network.
Our experiments showed that, while training a neural network, the boundaries of the linear regions tend to move away from the training data points.
arXiv Detail & Related papers (2020-07-14T04:06:00Z)
- Modeling from Features: a Mean-field Framework for Over-parameterized Deep Neural Networks [54.27962244835622]
This paper proposes a new mean-field framework for over-parameterized deep neural networks (DNNs).
In this framework, a DNN is represented by probability measures and functions over its features in the continuous limit.
We illustrate the framework via the standard DNN and the Residual Network (Res-Net) architectures.
arXiv Detail & Related papers (2020-07-03T01:37:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.