Online Neural Path Guiding with Normalized Anisotropic Spherical
Gaussians
- URL: http://arxiv.org/abs/2303.08064v2
- Date: Tue, 27 Feb 2024 05:32:03 GMT
- Title: Online Neural Path Guiding with Normalized Anisotropic Spherical
Gaussians
- Authors: Jiawei Huang, Akito Iizuka, Hajime Tanaka, Taku Komura, Yoshifumi
Kitamura
- Abstract summary: We propose a novel online framework to learn a spatially-varying density model with a single small neural network.
Our framework learns the distribution in a progressive manner and does not need any warm-up phases.
- Score: 20.68953631807367
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The variance reduction speed of physically-based rendering is heavily
affected by the adopted importance sampling technique. In this paper we propose
a novel online framework to learn a spatially-varying density model with a
single small neural network using stochastic ray samples. To achieve this task,
we propose a novel closed-form density model called the normalized anisotropic
spherical Gaussian mixture, which can express complex irradiance fields with a
small number of parameters. Our framework learns the distribution in a
progressive manner and does not need any warm-up phases. Due to the compact and
expressive representation of our density model, our framework can be
implemented entirely on the GPU, allowing it to produce high-quality images with
limited computational resources.
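The paper's normalized anisotropic spherical Gaussians are not spelled out in this abstract, so the sketch below illustrates only the closed-form normalization idea using simpler isotropic spherical Gaussian lobes (vMF-like), for which a single lobe exp(λ(μ·ω − 1)) integrates to 2π(1 − e^{−2λ})/λ over the sphere. All function names are ours, not the paper's:

```python
import math
import random

def normalized_sg_pdf(w, mu, lam):
    """Build a pdf over unit directions from a normalized spherical-Gaussian
    mixture. w: mixture weights summing to 1, mu: unit mean directions,
    lam: per-lobe sharpness. The closed-form lobe integral
    2*pi*(1 - exp(-2*lam))/lam gives each lobe's normalization constant."""
    norms = [l / (2.0 * math.pi * (1.0 - math.exp(-2.0 * l))) for l in lam]

    def pdf(d):
        return sum(
            w_i * n_i * math.exp(l_i * (sum(m * x for m, x in zip(m_i, d)) - 1.0))
            for w_i, n_i, m_i, l_i in zip(w, norms, mu, lam))
    return pdf

def uniform_sphere(rng):
    """Sample a uniformly distributed unit direction."""
    z = 2.0 * rng.random() - 1.0
    phi = 2.0 * math.pi * rng.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

# Monte Carlo check: a properly normalized pdf integrates to 1 over the sphere.
rng = random.Random(0)
pdf = normalized_sg_pdf(w=[0.6, 0.4],
                        mu=[(0.0, 0.0, 1.0), (1.0, 0.0, 0.0)],
                        lam=[4.0, 8.0])
n = 100_000
estimate = 4.0 * math.pi * sum(pdf(uniform_sphere(rng)) for _ in range(n)) / n
print(round(estimate, 2))  # should be close to 1.0
```

Having the normalization in closed form is what makes such a mixture usable as an importance-sampling density: the network only has to predict lobe parameters, and the pdf needed by the Monte Carlo estimator comes for free.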
Related papers
- Don't Splat your Gaussians: Volumetric Ray-Traced Primitives for Modeling and Rendering Scattering and Emissive Media [8.792248506305937]
We formalize and generalize the modeling of scattering and emissive media using mixtures of simple kernel-based volumetric primitives.
We demonstrate our method as a compact and efficient alternative to other forms of volume modeling for forward and inverse rendering of scattering media.
We also introduce the Epanechnikov kernel and demonstrate its potential as an efficient alternative to the traditionally-used Gaussian kernel in scene reconstruction tasks.
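The Epanechnikov kernel mentioned above has the standard one-dimensional form K(u) = (3/4)(1 − u²) on |u| ≤ 1 and zero outside; its compact support is the usual reason it is attractive as an alternative to the Gaussian. A minimal sketch (function names are ours):

```python
def epanechnikov(u):
    """Standard 1-D Epanechnikov kernel: 3/4 * (1 - u^2) on [-1, 1], else 0.
    Unlike a Gaussian, it has compact support, so a primitive's influence
    can be cut off exactly rather than truncated approximately."""
    return 0.75 * (1.0 - u * u) if -1.0 <= u <= 1.0 else 0.0

# Midpoint-rule check that the kernel integrates to 1 over [-1, 1].
n = 10_000
h = 2.0 / n
total = sum(epanechnikov(-1.0 + (i + 0.5) * h) for i in range(n)) * h
print(round(total, 4))  # prints 1.0
```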
arXiv Detail & Related papers (2024-05-24T10:42:05Z)
- Binary Opacity Grids: Capturing Fine Geometric Detail for Mesh-Based View Synthesis [70.40950409274312]
We modify density fields to encourage them to converge towards surfaces, without compromising their ability to reconstruct thin structures.
We also develop a fusion-based meshing strategy followed by mesh simplification and appearance model fitting.
The compact meshes produced by our model can be rendered in real-time on mobile devices.
arXiv Detail & Related papers (2024-02-19T18:59:41Z)
- Adaptive Shells for Efficient Neural Radiance Field Rendering [92.18962730460842]
We propose a neural radiance formulation that smoothly transitions between volume- and surface-based rendering.
Our approach enables efficient rendering at very high fidelity.
We also demonstrate that the extracted envelope enables downstream applications such as animation and simulation.
arXiv Detail & Related papers (2023-11-16T18:58:55Z)
- Generative Neural Fields by Mixtures of Neural Implicit Functions [43.27461391283186]
We propose a novel approach to learning the generative neural fields represented by linear combinations of implicit basis networks.
Our algorithm learns basis networks in the form of implicit neural representations and their coefficients in a latent space by either conducting meta-learning or adopting auto-decoding paradigms.
arXiv Detail & Related papers (2023-10-30T11:41:41Z)
- Adaptive Multi-NeRF: Exploit Efficient Parallelism in Adaptive Multiple Scale Neural Radiance Field Rendering [3.8200916793910973]
Recent advances in Neural Radiance Fields (NeRF) have demonstrated significant potential for representing 3D scene appearances as implicit neural networks.
However, the lengthy training and rendering process hinders the widespread adoption of this promising technique for real-time rendering applications.
We present an effective adaptive multi-NeRF method designed to accelerate the neural rendering process for large scenes.
arXiv Detail & Related papers (2023-10-03T08:34:49Z)
- Truly Mesh-free Physics-Informed Neural Networks [3.5611181253285253]
Physics-informed Neural Networks (PINNs) have recently emerged as a principled way to include prior physical knowledge in form of partial differential equations (PDEs) into neural networks.
We present a mesh-free and adaptive approach termed particle-density PINN (pdPINN) which is inspired by the microscopic viewpoint of fluid dynamics.
arXiv Detail & Related papers (2022-06-03T12:45:47Z)
- InfoNeRF: Ray Entropy Minimization for Few-Shot Neural Volume Rendering [55.70938412352287]
We present an information-theoretic regularization technique for few-shot novel view synthesis based on neural implicit representation.
The proposed approach minimizes the potential reconstruction inconsistency that arises from insufficient viewpoints.
We achieve consistently improved performance compared to existing neural view synthesis methods by large margins on multiple standard benchmarks.
arXiv Detail & Related papers (2021-12-31T11:56:01Z)
- NeRF in detail: Learning to sample for view synthesis [104.75126790300735]
Neural radiance fields (NeRF) methods have demonstrated impressive novel view synthesis.
In this work we address a clear limitation of the vanilla coarse-to-fine approach -- that it is based on a heuristic and not trained end-to-end for the task at hand.
We introduce a differentiable module that learns to propose samples and their importance for the fine network, and consider and compare multiple alternatives for its neural architecture.
arXiv Detail & Related papers (2021-06-09T17:59:10Z)
- Learning High-Dimensional Distributions with Latent Neural Fokker-Planck Kernels [67.81799703916563]
We introduce new techniques to formulate the problem as solving Fokker-Planck equation in a lower-dimensional latent space.
Our proposed model consists of latent-distribution morphing, a generator and a parameterized Fokker-Planck kernel function.
arXiv Detail & Related papers (2021-05-10T17:42:01Z)
- Deep Magnification-Flexible Upsampling over 3D Point Clouds [103.09504572409449]
We propose a novel end-to-end learning-based framework to generate dense point clouds.
We first formulate the problem explicitly, which boils down to determining the weights and high-order approximation errors.
Then, we design a lightweight neural network to adaptively learn unified and sorted weights as well as the high-order refinements.
arXiv Detail & Related papers (2020-11-25T14:00:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.