Matérn Kernels for Tunable Implicit Surface Reconstruction
- URL: http://arxiv.org/abs/2409.15466v1
- Date: Mon, 23 Sep 2024 18:45:42 GMT
- Title: Matérn Kernels for Tunable Implicit Surface Reconstruction
- Authors: Maximilian Weiherer, Bernhard Egger
- Abstract summary: Matérn kernels have appealing properties which make them particularly well suited for surface reconstruction.
Because they are stationary, we demonstrate that the Matérn kernels' spectrum can be tuned in the same fashion as Fourier feature mappings help coordinate-based MLPs overcome spectral bias.
We analyze the Matérn kernels' connection to SIREN networks and their relation to previously employed arc-cosine kernels.
- Score: 5.8691349601057325
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose to use the family of Matérn kernels for tunable implicit surface reconstruction, building upon the recent success of kernel methods for 3D reconstruction of oriented point clouds. As we show, from both a theoretical and a practical perspective, Matérn kernels have appealing properties which make them particularly well suited for surface reconstruction -- outperforming state-of-the-art methods based on the arc-cosine kernel while being significantly easier to implement, faster to compute, and scalable. Being stationary, we demonstrate that the Matérn kernels' spectrum can be tuned in the same fashion as Fourier feature mappings help coordinate-based MLPs overcome spectral bias. Moreover, we theoretically analyze the Matérn kernels' connection to SIREN networks as well as their relation to previously employed arc-cosine kernels. Finally, based on the recently introduced Neural Kernel Fields, we present data-dependent Matérn kernels and conclude that the Laplace kernel in particular (a member of the Matérn family) is extremely competitive, performing almost on par with state-of-the-art methods in the noise-free case while having a more than five times shorter training time.
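The half-integer members of the Matérn family have simple closed forms, which makes the tuning described above easy to illustrate. The following is a minimal NumPy sketch (not the authors' code): ν and the lengthscale control how fast the kernel's spectral density decays, the same knob a Fourier feature mapping's bandwidth provides; all parameter values are illustrative.

```python
import numpy as np

def matern(r, nu=0.5, lengthscale=1.0, sigma=1.0):
    """Half-integer Matérn kernels as a function of distance r.

    nu=0.5 is the Laplace (exponential) kernel; larger nu yields
    smoother functions and a faster-decaying spectrum.
    """
    s = np.sqrt(2 * nu) * r / lengthscale
    if nu == 0.5:                          # Laplace kernel
        return sigma**2 * np.exp(-s)
    if nu == 1.5:
        return sigma**2 * (1 + s) * np.exp(-s)
    if nu == 2.5:
        return sigma**2 * (1 + s + s**2 / 3) * np.exp(-s)
    raise ValueError("only half-integer nu in {0.5, 1.5, 2.5} here")

def matern_spectral_density(w, nu=0.5, lengthscale=1.0, d=3):
    """Spectral density of a stationary Matérn kernel in d dimensions,
    up to a constant: shrinking the lengthscale (or nu) shifts mass
    toward high frequencies, i.e. less spectral bias."""
    return (2 * nu / lengthscale**2 + w**2) ** -(nu + d / 2)
```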
Related papers
- Correspondence of NNGP Kernel and the Matern Kernel [0.6990493129893112]
We take a practical approach to explore the neural network Gaussian process (NNGP) kernel and its application to data in Gaussian process regression.
We demonstrate the necessity of normalization to produce valid NNGP kernels and explore related numerical challenges.
We then demonstrate a surprising result that the predictions given from the NNGP kernel correspond closely to those given by the Matern kernel under specific circumstances.
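To make that correspondence concrete, here is a toy NumPy sketch of our own (not the authors' experiments): it computes GP posterior means under a Matérn-5/2 kernel and under the standard arc-cosine NNGP recursion for a deep ReLU network, so the two predictions can be compared directly; depth, σ_w², σ_b², and the noise level are illustrative.

```python
import numpy as np

def matern52(X, Y, l=1.0):
    r = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)
    s = np.sqrt(5.0) * r / l
    return (1.0 + s + s**2 / 3.0) * np.exp(-s)

def nngp_relu(X, Y, depth=3, sw2=2.0, sb2=0.1):
    """Standard arc-cosine (NNGP) recursion for a deep ReLU network."""
    d = X.shape[1]
    Kxy = sb2 + sw2 * X @ Y.T / d
    Kxx = sb2 + sw2 * np.sum(X**2, axis=1) / d
    Kyy = sb2 + sw2 * np.sum(Y**2, axis=1) / d
    for _ in range(depth):
        norm = np.sqrt(np.outer(Kxx, Kyy))
        theta = np.arccos(np.clip(Kxy / norm, -1.0, 1.0))
        Kxy = sb2 + sw2 / (2 * np.pi) * norm * (
            np.sin(theta) + (np.pi - theta) * np.cos(theta))
        Kxx = sb2 + sw2 * Kxx / 2.0   # E[relu(u)^2] = var(u) / 2
        Kyy = sb2 + sw2 * Kyy / 2.0
    return Kxy

def gp_mean(k, Xtr, ytr, Xte, noise=1e-3):
    """GP posterior mean; compare k=matern52 against k=nngp_relu."""
    K = k(Xtr, Xtr) + noise * np.eye(len(ytr))
    return k(Xte, Xtr) @ np.linalg.solve(K, ytr)
```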
arXiv Detail & Related papers (2024-10-10T19:00:05Z)
- On the Approximation of Kernel functions [0.0]
The paper addresses approximations of the kernel itself.
For the Hilbert Gauss kernel on the unit cube, the paper establishes an upper bound on the associated eigenfunctions.
This bound supports low-rank approximation methods such as the Nyström method.
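For context, the Nyström method builds a rank-m surrogate of the full Gram matrix from m landmark columns. A minimal sketch; the landmark count and the Gaussian test kernel are our choices for illustration, not the paper's.

```python
import numpy as np

def nystrom(kernel_fn, X, m=50, rng=None):
    """Rank-m Nyström approximation K ≈ K_nm @ pinv(K_mm) @ K_nm.T,
    built from m randomly chosen landmark points."""
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(X), size=m, replace=False)
    K_nm = kernel_fn(X, X[idx])         # n x m cross-kernel block
    K_mm = kernel_fn(X[idx], X[idx])    # m x m landmark block
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

gauss = lambda A, B: np.exp(-np.linalg.norm(A[:, None] - B[None, :], axis=-1) ** 2)
X = np.random.default_rng(0).standard_normal((500, 3))
K_approx = nystrom(gauss, X, m=50)   # rank-50 stand-in for the 500x500 Gram matrix
```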
arXiv Detail & Related papers (2024-03-11T13:50:07Z)
- Neural Fields as Learnable Kernels for 3D Reconstruction [101.54431372685018]
We present a novel method for reconstructing implicit 3D shapes based on kernel ridge regression with a learned kernel.
Our technique achieves state-of-the-art results when reconstructing 3D objects and large scenes from sparse oriented points.
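The non-learned baseline behind such methods is plain kernel ridge regression on points offset along their normals. A minimal sketch with a Laplace kernel and hypothetical defaults; the learned, data-dependent kernel of the paper is not reproduced here.

```python
import numpy as np
from scipy.spatial.distance import cdist

def fit_implicit(points, normals, eps=0.01, lam=1e-6, lengthscale=0.5):
    """Fit an implicit function f with f ≈ 0 on the surface via kernel
    ridge regression on points offset along their oriented normals."""
    X = np.vstack([points, points + eps * normals, points - eps * normals])
    y = np.concatenate([np.zeros(len(points)),
                        eps * np.ones(len(points)),
                        -eps * np.ones(len(points))])
    K = np.exp(-cdist(X, X) / lengthscale)          # Laplace kernel matrix
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    def f(q):                                       # implicit function
        return np.exp(-cdist(np.atleast_2d(q), X) / lengthscale) @ alpha
    return f  # extract the surface as the zero level set of f
```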
arXiv Detail & Related papers (2021-11-26T18:59:04Z)
- Temporally-Consistent Surface Reconstruction using Metrically-Consistent Atlases [131.50372468579067]
We propose a method for unsupervised reconstruction of a temporally-consistent sequence of surfaces from a sequence of time-evolving point clouds.
We represent the reconstructed surfaces as atlases computed by a neural network, which enables us to establish correspondences between frames.
Our approach outperforms state-of-the-art ones on several challenging datasets.
arXiv Detail & Related papers (2021-11-12T17:48:25Z)
- Hida-Matérn Kernel [8.594140167290098]
We present the class of Hida-Matérn kernels, which is the canonical family of covariance functions over the entire space of stationary Gauss-Markov processes.
We show how to represent such processes as state space models using only the kernel and its derivatives.
We also show how exploiting special properties of the state space representation enables improved numerical stability in addition to further reductions of computational complexity.
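For the Matérn-3/2 member of the family, the state space (SDE) representation mentioned above is standard and small enough to verify numerically. This sketch checks that the kernel recovered from the state space model matches the closed form; σ and the lengthscale are illustrative.

```python
import numpy as np
from scipy.linalg import expm

# State space (SDE) form of the Matérn-3/2 kernel, a Gauss-Markov
# process of order 2: dz = F z dt + L dW, observed through H.
sigma, lengthscale = 1.0, 0.7
lam = np.sqrt(3.0) / lengthscale
F = np.array([[0.0, 1.0], [-lam**2, -2.0 * lam]])
H = np.array([[1.0, 0.0]])
P_inf = np.diag([sigma**2, lam**2 * sigma**2])   # stationary state covariance

# Sanity check: k(tau) = H expm(F*tau) P_inf H.T must equal the
# closed-form Matérn-3/2 kernel sigma^2 (1 + lam*tau) exp(-lam*tau).
tau = 0.9
k_state_space = (H @ expm(F * tau) @ P_inf @ H.T).item()
k_closed_form = sigma**2 * (1.0 + lam * tau) * np.exp(-lam * tau)
assert np.isclose(k_state_space, k_closed_form)
```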
arXiv Detail & Related papers (2021-07-15T03:25:10Z)
- Scaling Neural Tangent Kernels via Sketching and Random Features [53.57615759435126]
Recent works report that NTK regression can outperform finitely-wide neural networks trained on small-scale datasets.
We design a near input-sparsity time approximation algorithm for NTK, by sketching the expansions of arc-cosine kernels.
We show that a linear regressor trained on our CNTK features matches the accuracy of the exact CNTK on the CIFAR-10 dataset while achieving a 150x speedup.
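The identity underlying such expansions is that the order-1 arc-cosine kernel is the expected product of ReLU features over Gaussian weights, so Monte Carlo features approximate it. A minimal sketch of that construction; the feature count D is illustrative, and this is not the paper's sketching algorithm.

```python
import numpy as np

def relu_random_features(X, D=4096, seed=0):
    """Random ReLU features whose inner products approximate the
    order-1 arc-cosine kernel: k1(x, y) = 2 E[relu(w.x) relu(w.y)]
    with w ~ N(0, I)."""
    W = np.random.default_rng(seed).standard_normal((D, X.shape[1]))
    return np.sqrt(2.0 / D) * np.maximum(X @ W.T, 0.0)

def arccos1(X, Y):
    """Closed-form order-1 arc-cosine kernel, for checking the features."""
    nx = np.linalg.norm(X, axis=1)[:, None]
    ny = np.linalg.norm(Y, axis=1)[None, :]
    t = np.arccos(np.clip(X @ Y.T / (nx * ny), -1.0, 1.0))
    return nx * ny * (np.sin(t) + (np.pi - t) * np.cos(t)) / np.pi

X = np.random.default_rng(1).standard_normal((5, 3))
Z = relu_random_features(X)
print(np.max(np.abs(Z @ Z.T - arccos1(X, X))))   # small for large D
```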
arXiv Detail & Related papers (2021-06-15T04:44:52Z)
- Random Features for the Neural Tangent Kernel [57.132634274795066]
We propose an efficient feature map construction for the Neural Tangent Kernel (NTK) of a fully-connected ReLU network.
We show that the dimension of the resulting features is much smaller than that of other baseline feature map constructions achieving comparable error bounds, both in theory and in practice.
arXiv Detail & Related papers (2021-04-03T09:08:12Z)
- Neural Splines: Fitting 3D Surfaces with Infinitely-Wide Neural Networks [61.07202852469595]
We present Neural Splines, a technique for 3D surface reconstruction that is based on random feature kernels arising from infinitely-wide shallow ReLU networks.
Our method achieves state-of-the-art results, outperforming recent neural network-based techniques and the widely used Poisson Surface Reconstruction.
arXiv Detail & Related papers (2020-06-24T14:54:59Z)
- Sparse Gaussian Processes via Parametric Families of Compactly-supported Kernels [0.6091702876917279]
We propose a method for deriving parametric families of kernel functions with compact support.
The parameters of this family of kernels can be learned from data using maximum likelihood estimation.
We show that these approximations incur minimal error over the exact models when modeling data drawn directly from a target GP.
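As a generic illustration of why compact support matters (kernel values are exactly zero beyond the support radius, so Gram matrices become sparse), here is the classical Wendland C² kernel for inputs of dimension at most three; the paper derives its own parametric families, which are not reproduced here.

```python
import numpy as np
from scipy.spatial.distance import cdist

def wendland_c2(r, support=1.0):
    """Classical Wendland C^2 kernel, positive definite for d <= 3;
    identically zero for r >= support."""
    u = r / support
    return np.where(u < 1.0, (1.0 - u)**4 * (4.0 * u + 1.0), 0.0)

X = np.random.default_rng(0).random((200, 3))
K = wendland_c2(cdist(X, X), support=0.2)   # mostly zeros -> sparse storage pays off
print(f"fraction of nonzeros: {np.count_nonzero(K) / K.size:.1%}")
```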
arXiv Detail & Related papers (2020-06-05T20:44:09Z)
- Kernel-Based Reinforcement Learning: A Finite-Time Analysis [53.47210316424326]
We introduce Kernel-UCBVI, a model-based optimistic algorithm that leverages the smoothness of the MDP and a non-parametric kernel estimator of the rewards.
We empirically validate our approach in continuous MDPs with sparse rewards.
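Stripped to its core, the reward estimator in such methods is a kernel smoother over visited state-action pairs plus an optimism bonus that grows where kernel mass is low. A toy sketch; the bandwidth and bonus scale are illustrative, and this is not the full Kernel-UCBVI algorithm.

```python
import numpy as np

def optimistic_reward(query, visited, rewards, bandwidth=0.5, beta=1.0):
    """Nadaraya-Watson reward estimate at a query state-action pair,
    plus a UCB-style bonus ~ 1/sqrt(local kernel mass)."""
    d = np.linalg.norm(visited - query, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)           # Gaussian smoothing kernel
    mass = max(w.sum(), 1e-12)
    r_hat = w @ rewards / mass                  # smoothed reward estimate
    bonus = beta / np.sqrt(mass)                # optimism in rarely-visited regions
    return r_hat + bonus
```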
arXiv Detail & Related papers (2020-04-12T12:23:46Z)