On Optimal Sampling for Learning SDF Using MLPs Equipped with Positional
Encoding
- URL: http://arxiv.org/abs/2401.01391v1
- Date: Tue, 2 Jan 2024 10:51:52 GMT
- Title: On Optimal Sampling for Learning SDF Using MLPs Equipped with Positional
Encoding
- Authors: Guying Lin, Lei Yang, Yuan Liu, Congyi Zhang, Junhui Hou, Xiaogang
Jin, Taku Komura, John Keyser, Wenping Wang
- Abstract summary: We devise a tool to determine the appropriate sampling rate for learning an accurate neural implicit field without undesirable side effects.
It is observed that a PE-equipped MLP has an intrinsic frequency much higher than the highest frequency component in the PE layer.
We empirically show, in the setting of SDF fitting, that this recommended sampling rate is sufficient to secure accurate fitting results.
- Score: 79.67071790034609
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural implicit fields, such as the neural signed distance field (SDF) of a
shape, have emerged as a powerful representation for many applications, e.g.,
encoding a 3D shape and performing collision detection. Typically, implicit
fields are encoded by Multi-layer Perceptrons (MLP) with positional encoding
(PE) to capture high-frequency geometric details. However, a notable side
effect of such PE-equipped MLPs is the noisy artifacts present in the learned
implicit fields. While increasing the sampling rate could in general mitigate
these artifacts, in this paper we aim to explain this adverse phenomenon
through the lens of Fourier analysis. We devise a tool to determine the
appropriate sampling rate for learning an accurate neural implicit field
without undesirable side effects. Specifically, we propose a simple yet
effective method to estimate the intrinsic frequency of a given network with
randomized weights based on the Fourier analysis of the network's responses. It
is observed that a PE-equipped MLP has an intrinsic frequency much higher than
the highest frequency component in the PE layer. Sampling against this
intrinsic frequency following the Nyquist-Shannon sampling theorem allows us to
determine an appropriate training sampling rate. We empirically show in the
setting of SDF fitting that this recommended sampling rate is sufficient to
secure accurate fitting results, while further increasing the sampling rate
would not noticeably reduce the fitting error further. Simply training PE-equipped
MLPs with our sampling strategy yields performance superior to existing methods.
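The procedure the abstract outlines admits a compact illustration. The following is a minimal, hypothetical Python sketch (not the authors' code): evaluate a PE-equipped MLP with randomized weights on a dense 1D grid, take the Fourier transform of its response, read off the highest frequency carrying significant energy as the intrinsic frequency, and double it per Nyquist-Shannon to obtain a training sampling rate. The layer widths, the number of PE bands L, the 1% energy threshold, and the use of a single random draw are all assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)
L = 6                                    # number of PE frequency bands (assumption)
widths = [2 * L, 64, 64, 1]              # PE output -> hidden -> hidden -> scalar SDF

def positional_encoding(x):
    # Standard sin/cos encoding: [sin(2^k pi x), cos(2^k pi x)] for k = 0..L-1.
    freqs = 2.0 ** np.arange(L) * np.pi
    return np.concatenate([np.sin(freqs * x[:, None]),
                           np.cos(freqs * x[:, None])], axis=1)

def random_mlp(x):
    # Forward pass through ReLU layers with randomized (untrained) weights.
    h = positional_encoding(x)
    for fan_in, fan_out in zip(widths[:-1], widths[1:]):
        W = rng.normal(0.0, np.sqrt(2.0 / fan_in), (fan_in, fan_out))
        h = h @ W
        if fan_out != 1:
            h = np.maximum(h, 0.0)       # ReLU on hidden layers only
    return h[:, 0]

# Evaluate on a dense 1D grid, FFT the response, and take the highest
# frequency whose energy exceeds an illustrative 1% threshold.
n, domain = 8192, 2.0                    # samples over x in [-1, 1]
x = np.linspace(-1.0, 1.0, n, endpoint=False)
spectrum = np.abs(np.fft.rfft(random_mlp(x)))
freqs = np.fft.rfftfreq(n, d=domain / n)  # cycles per unit length
f_intrinsic = freqs[spectrum > 0.01 * spectrum.max()].max()

# Nyquist-Shannon: train with at least twice the intrinsic frequency.
print(f"intrinsic frequency ~ {f_intrinsic:.1f} cycles/unit, "
      f"recommended sampling rate ~ {2.0 * f_intrinsic:.1f} samples/unit")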
Related papers
- Adaptive Random Fourier Features Training Stabilized By Resampling With Applications in Image Regression [0.8947831206263182]
We present an enhanced adaptive random Fourier features (ARFF) training algorithm for shallow neural networks.
This method uses a particle-filter-type resampling technique to stabilize the training process and reduce sensitivity to parameter choices.
arXiv Detail & Related papers (2024-10-08T22:08:03Z) - FreSh: Frequency Shifting for Accelerated Neural Representation Learning [11.175745750843484]
Implicit Neural Representations (INRs) have recently gained attention as a powerful approach for continuously representing signals such as images, videos, and 3D shapes using multilayer perceptrons (MLPs).
However, MLPs are known to exhibit a low-frequency bias, limiting their ability to capture high-frequency details accurately.
We propose frequency shifting (or FreSh) to align the frequency spectrum of the initial output with that of the target signal.
arXiv Detail & Related papers (2024-10-07T14:05:57Z) - REAL Sampling: Boosting Factuality and Diversity of Open-Ended Generation via Asymptotic Entropy [93.8400683020273]
Decoding methods for large language models (LLMs) usually struggle with the tradeoff between ensuring factuality and maintaining diversity.
We propose REAL sampling, a decoding method that improves factuality and diversity over nucleus sampling.
arXiv Detail & Related papers (2024-06-11T21:44:49Z) - TSDF-Sampling: Efficient Sampling for Neural Surface Field using
Truncated Signed Distance Field [9.458310455872438]
This paper introduces a novel approach that substantially reduces the number of samplings by incorporating the Truncated Signed Distance Field (TSDF) of the scene.
Our empirical results show an 11-fold increase in inference speed without compromising performance.
arXiv Detail & Related papers (2023-11-29T18:23:18Z) - Approximate Thompson Sampling via Epistemic Neural Networks [26.872304174606278]
Epistemic neural networks (ENNs) are designed to produce accurate joint predictive distributions.
We show that ENNs serve this purpose well and illustrate how the quality of joint predictive distributions drives performance.
arXiv Detail & Related papers (2023-02-18T01:58:15Z) - PREF: Phasorial Embedding Fields for Compact Neural Representations [54.44527545923917]
We present a phasorial embedding field, PREF, as a compact representation to facilitate neural signal modeling and reconstruction tasks.
Our experiments show that PREF-based neural signal processing is on par with the state of the art in 2D image completion, 3D SDF surface regression, and 5D radiance field reconstruction.
arXiv Detail & Related papers (2022-05-26T17:43:03Z) - Sigma-Delta and Distributed Noise-Shaping Quantization Methods for
Random Fourier Features [73.25551965751603]
We prove that our quantized RFFs allow a high accuracy approximation of the underlying kernels.
We show that the quantized RFFs can be further compressed, yielding an excellent trade-off between memory use and accuracy.
Testing our methods on several machine learning tasks, we empirically show that they compare favorably to other state-of-the-art quantization methods in this context.
arXiv Detail & Related papers (2021-06-04T17:24:47Z) - Multi-Scale Positive Sample Refinement for Few-Shot Object Detection [61.60255654558682]
Few-shot object detection (FSOD) helps detectors adapt to unseen classes with few training instances.
We propose a Multi-scale Positive Sample Refinement (MPSR) approach to enrich object scales in FSOD.
MPSR generates multi-scale positive samples as object pyramids and refines the prediction at various scales.
arXiv Detail & Related papers (2020-07-18T09:48:29Z) - Fourier Features Let Networks Learn High Frequency Functions in Low
Dimensional Domains [69.62456877209304]
We show that passing input points through a simple Fourier feature mapping enables a multilayer perceptron to learn high-frequency functions.
These results shed light on advances in computer vision and graphics that achieve state-of-the-art performance; a minimal sketch of the mapping follows below.
arXiv Detail & Related papers (2020-06-18T17:59:11Z)
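As background for the last entry, the Fourier feature mapping studied in that paper sends an input v through gamma(v) = [cos(2 pi B v), sin(2 pi B v)], with B drawn from a Gaussian, before the MLP. A minimal sketch, where the input dimension, feature count, and scale sigma are chosen arbitrarily for illustration:

import numpy as np

rng = np.random.default_rng(0)
d_in, m, sigma = 3, 256, 10.0           # input dim, feature count, scale (assumptions)
B = rng.normal(0.0, sigma, (m, d_in))   # random Gaussian frequency matrix

def fourier_features(v):
    # Map inputs v of shape (batch, d_in) to [cos(2 pi B v), sin(2 pi B v)].
    proj = 2.0 * np.pi * v @ B.T        # (batch, m) projections onto random frequencies
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=1)

points = rng.uniform(-1.0, 1.0, (4, d_in))  # e.g., four 3D query points
print(fourier_features(points).shape)       # (4, 2*m): fed to the MLP as input

The scale sigma controls the bandwidth of the random frequencies and hence, as both that paper and the main abstract suggest, the frequency content the downstream MLP can represent.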